With its latest FSD ("Full Self-Driving") beta release, Tesla is asking drivers to consent to the collection of video from a car's exterior and interior cameras in the event of an accident or "serious safety risk." This marks the first time Tesla will attach footage to a specific vehicle and driver, according to an Electrek report.
Tesla has gathered video footage as part of FSD before, but only to train and improve its self-driving AI systems. Under the new agreement, however, Tesla will be able to associate video with specific vehicles. "By enabling FSD Beta, I consent to Tesla's collection of VIN-associated image data from the vehicle's external cameras and Cabin Camera in the occurrence of a serious safety risk or a safety event like a collision," the agreement reads.
As Electrek notes, the language could indicate that Tesla wants to ensure it has evidence in case its FSD system is blamed for an accident. The footage could also help the company detect and fix serious issues more quickly.
FSD 10.3 was released more widely than previous betas, but it was quickly pulled back due to issues including unwarranted Forward Collision Warnings and unexpected automatic braking. At the time, CEO Elon Musk tweeted that such issues are "to be expected with beta software," adding that "it is impossible to test all hardware configs in all conditions with internal QA, hence public tests."
However, other drivers on public roads become unwitting beta testers, too. The National Highway Traffic Safety Administration is currently investigating a driver's complaint that FSD caused a November 3rd collision in Brea, California. The owner alleged that the system steered his Model Y into the wrong lane, where it hit another car, causing considerable damage to both vehicles.