Tesla said Thursday that a driver, identified as Joshua Brown, was killed while using the "Autopilot" driver-assistance system on its Model S electric car, prompting a federal safety investigation.
Tesla said the National Highway Traffic Safety Administration (NHTSA) had opened a "preliminary evaluation" into the performance of Autopilot after the electric car company notified the agency of the fatality.
In a statement, Tesla said the fatality was "a tragic loss" and was the first such incident with its Autopilot system activated.
"This is the first known fatality in just over 130 million miles (209 million kilometers) where Autopilot was activated," the company said.
"Among all vehicles in the US, there is a fatality every 94 million miles. Worldwide, there is a fatality approximately every 60 million miles," it said.
"It is important to emphasize that the NHTSA action is simply a preliminary evaluation to determine whether the system worked according to expectations."
Tesla, known for its high-end electric cars, last year unveiled the system, which allows the vehicle to automatically change lanes, manage speed and even apply the brakes. The driver activates the system and can override it at any time.
Tesla said that in the fatal accident, the car was on a divided highway when a tractor-trailer drove across the road, perpendicular to the path of the Model S.
"Neither Autopilot nor the driver noticed the white side of the tractor-trailer against a brightly lit sky, so the brake was not applied," the statement said.
"The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S."
Tesla said that if there had been an impact against the front or rear of the trailer, even at high speed, "its advanced crash safety system would likely have prevented serious injury as it has in numerous other similar incidents."
Tesla said it followed "our standard practice" by informing federal safety officials.
"It is important to note that Tesla disables Autopilot by default and requires explicit acknowledgement that the system is new technology and still in a public beta phase before it can be enabled," Tesla said.
When drivers activate the system, they see a warning saying that it is "an assist feature that requires you to keep your hands on the steering wheel at all times," and that "you need to maintain control and responsibility for your vehicle" while using it, according to the company.
Deceased Tesla driver earlier credited Autopilot system with saving his life
In April, Brown, a former US Navy technician, had uploaded a video to YouTube claiming his Model S's Autopilot had saved him from a crash with another truck.
"The autopilot saved the car autonomously from a side collision from a boom lift truck," explained Brown on his video's page.
"I actually wasn't watching that direction and Tessy (the name of my car) was on duty with autopilot engaged. I became aware of the danger when Tessy alerted me with the 'immediately take over' warning chime and the car swerving to the right to avoid the side collision.
"I have done a lot of testing with the sensors in the car and the software capabilities. I have always been impressed with the car, but I had not tested the car's side collision avoidance. I am VERY impressed. Excellent job Elon!"
The news comes amid growing interest in self-driving cars following tests over the past few years by Google and research by several major automakers.
Investments are also being made in autonomous trucks and small buses.
A major study released earlier this month said the looming arrival of self-driving vehicles is likely to vastly reduce traffic fatalities, but also poses difficult moral dilemmas.
The scientists said autonomous driving systems will require programmers to develop algorithms to make critical decisions that are based more on ethics than technology, such as whether to sacrifice a driver or passenger rather than pedestrians.