Tesla Model S Autopilot (Self-Driving) Records First Fatal Crash


Future Cars: The driver of a TESLA MODEL S has died in a crash while using the car’s semi-autonomous Autopilot (self-driving) feature, technology that remains in beta testing.

The belief that computers can operate a vehicle more safely than human drivers has increased the pace at which automakers and technology firms develop self-driving cars.

See Also: Tesla Model 3 Among Future Cars; Sweet Rides For 2017 And Beyond! 

The accident is reportedly the first known fatality involving a vehicle driving itself by means of sophisticated computer software, sensors, cameras and radar. It was revealed on Thursday that the driver of the Tesla Model S electric sedan was killed in an accident while the car was in self-driving mode.

According to The New York Times, the National Highway Traffic Safety Administration said preliminary reports indicated that the accident occurred when a tractor-trailer made a left turn in front of the Tesla, and the car failed to automatically apply the brakes.

The driver was identified as 40-year-old Joshua Brown, of Canton, Ohio, a Navy veteran who owned a technology consulting firm. Tesla Motors described him as a man “who spent his life focused on innovation and the promise of technology and also believed strongly in Tesla’s mission.”

Image: Tesla Model S electric sedan with its Autopilot (self-driving) mode enabled. The driver can choose to drive the vehicle with hands off the steering wheel.


About Tesla Motors’ Autopilot

As noted, Tesla disables Autopilot by default and requires explicit acknowledgement that the system is new technology still in a public beta phase before it can be enabled. Hence, the Silicon Valley automaker instructs drivers to keep their hands on the steering wheel at all times and to be ready to assume complete control at any moment.

Tesla says:

“We do this to ensure that every time the feature (Autopilot) is used, it is used as safely as possible.”

The company argues that even though the system is not perfect, “the data is unequivocal that Autopilot reduces driver workload and results in a statistically significant improvement in safety.”

See Also: Toyota Recalls 1.43 Million Cars Over Faulty Airbags

Other reports say that shortly after Tesla activated Autopilot via an over-the-air software update on October 15, 2015, people were posting videos of themselves doing all kinds of stunts, including sitting in the back seat or even sleeping, while the car drove itself. Before long, a few people had driven a Tesla Model S cross-country in less than 58 hours, using the same Autopilot (self-driving) feature to reach speeds of up to 90 miles per hour (mph).

So the rhetorical question remains: who is at fault, the car, the road or the driver? International law does not prohibit self-driving cars, according to legal expert Bryant Walker Smith.

In his words:

“Companies can get away with a lot that’s in a legal gray area, as long as nothing bad happens; otherwise, regulations would intervene when something goes awry.”

Meanwhile, the federal traffic safety agency is preparing a new set of guidelines and regulations for the testing of self-driving vehicles on public roads, expected to be released this July.