Tesla may soon release its latest FSD (Full Self-Driving) software to a large group of drivers, and US safety officials are not happy about it. They have reason to worry, according to a new MIT study seen by TechCrunch. The researchers analyzed the data and found that drivers become less attentive when using Tesla's Autopilot system.
The fact that drivers may pay less attention to the road when using Autopilot comes as no surprise. What is new is that the researchers were able to see exactly where drivers were looking while Autopilot was engaged.
Off-road glances were directed downward and toward the center-stack area, and were therefore "probably not driving-related." Instead, glances in these directions are usually associated with activities such as looking down at a smartphone or interacting with the touchscreen. With Autopilot enabled, these glances often lasted longer and were far more frequent than off-road glances during manual driving, according to the paper.
Despite the name, Tesla's FSD (Full Self-Driving) is just a driver-assistance system and is far from fully autonomous. It therefore requires drivers to keep their hands on the steering wheel and remain attentive, but Tesla does not use cameras or other means to enforce the attentiveness part.
The latest version, 10.0.1, is supposed to make safer driving decisions, but so far it has been released only to a relatively small group of beta testers. However, Tesla plans a wider rollout starting September 24 and may make it available to all FSD-equipped EVs, pending a seven-day test that will monitor each driver's driving habits.
However, the new version could put Tesla at odds with US regulators. The head of the National Transportation Safety Board (NTSB), Jennifer Homendy, recently stated that Tesla should not release the latest software update until it addresses its safety issues. She was also not thrilled that Tesla is testing the software on public roads.
Source of information: au.finance.yahoo.com