YouTube has taken down at least two videos showing Tesla Full Self-Driving being tested to see if it would stop when detecting children in front of the vehicle. The videos were uploaded by known pro-Tesla content creators who apparently used their own children to gauge how FSD reacted to seeing kids on the road.

Several Tesla owners and fans of the brand say they performed the tests in order to refute claims that vehicles with FSD activated would drive into children or smaller obstacles. Some used mannequins (and even a child-sized and shaped cardboard cutout), whilst others used their actual children for the test, prompting YouTube to pull the videos on the grounds that the activity put people in danger.

It all started after The Dawn Project, which claims to be a “safety advocacy and research organisation”, created an ad campaign showing video of a Tesla Model 3 with FSD engaged repeatedly running over a child-sized mannequin without even slowing down. The campaign’s message asked the US Congress not to allow FSD to be used on public roads.

This so-called experiment by The Dawn Project didn’t seem entirely truthful: the vehicle failed to detect the child-sized obstacle right in front of it and ploughed over the mannequin, making for quite a dramatic visual spectacle. However, whilst the driver did appear to have their hands off the steering wheel, there is a strong chance the accelerator pedal was pressed to induce this behaviour.

Knowing that, whilst FSD is not perfect, The Dawn Project’s campaign seemed to skew reality, pro-Tesla YouTubers like Whole Mars Catalog resorted to putting their children at risk just to disprove it. Of course, they should all have used the same kind of mannequin as the one in the anti-FSD campaign, but they probably figured real children would get more attention and drive the point home.

But this was bound to backfire, given how strict YouTube is about content showing people in danger of getting hurt. CNBC reached out to the Google-owned video-sharing platform to ask why these videos were removed, and got a statement back saying:

YouTube doesn’t allow content showing a minor participating in dangerous activities or encouraging minors to do dangerous activities. Upon review, we determined that the videos raised to us by CNBC violate our harmful and dangerous policies, and as a result we removed the content.

Tesla still has a lot of work to put into FSD before it can be deemed completely safe and fool-proof, but it is nowhere near as bad as some critics (who are oftentimes Tesla short sellers) would like you to believe; it is even occasionally quite impressive. The latest public beta is version 10.69, whose rollout among beta testers enrolled in the FSD programme began on August 20.