A fan of Tesla might think that the automaker just can’t catch a break when it comes to its autonomous driving tech. It’s already subject to several federal investigations over its marketing and deployment of technologies like Autopilot and Full Self-Driving (FSD), and as of last week, we can add another to the list involving around 2.4 million Tesla vehicles. This time, regulators are assessing the cars’ performance in low-visibility conditions after four documented accidents, one of which resulted in a fatality.
The National Highway Traffic Safety Administration (NHTSA) says the new probe is examining incidents in which FSD was engaged in fog, in heavy airborne dust, or when glare from the sun blinded the car’s cameras and caused a problem.
What the car can “see” is the big issue here. It’s also what Tesla bet its future on.
“Humans only need visible light”
Humans are bad drivers as well. Technology should try to do better than humans, not accept human limitations. When radar, lidar, and other sensors (possibly including things not invented yet) exist, we should use them to make cars safer.
Humans can move their heads to avoid glare. They can shield glare from their eyes with visors.
Tesla cameras currently can’t do either.
Musk is of course right. The “only” thing he forgot is that his vision-only model needs full human-level artificial intelligence behind it to work.
Very genius.
Musk is also forgetting that humans use other senses when driving, not just their sight.
I use echolocation by screaming at other drivers.
And even then, it would only be able to drive as well as a human, and humans kill tons of people on the highways.
If the Wright Brothers had used the same logic, their flying machines would all have flapped…