
A year and a half after the first fatal crash involving an autonomous vehicle, the National Transportation Safety Board has published its findings, concluding that the driver of the truck the self-driving car collided with was partially at fault for the accident.
Joshua Brown’s Tesla Model S was in Autopilot mode when it failed to “see” a truck making a left turn across the highway in front of it. The Tesla struck the center of the trailer, the car’s roof was sheared off, and Brown died on impact. He became the first person to die in a car that was driving itself.
Tesla says its Autopilot system failed to recognize the side of the trailer because it had not been programmed to do so: Autopilot is only meant to be used on highways with clear lane markings, proper medians, and entrance and exit ramps, a warning the system issues to drivers when they engage it.
The stretch of Highway 27 near Williston, Florida where the accident occurred is a four-lane highway with no median that runs right through the center of the towns along its route.
Back in January, the National Highway Traffic Safety Administration published its report on the crash. It found that Brown was responsible for the crash because he wasn’t paying attention to the road. The NHTSA concluded that Autopilot hadn’t malfunctioned; it was never supposed to be used in that situation in the first place.
The NTSB report published this month sees things differently. The board agreed that it was Brown’s responsibility to remain alert while the Tesla was in Autopilot mode, but it also placed some of the blame on Tesla for not putting sufficient safeguards in place to ensure that drivers keep paying attention to the road while Autopilot is engaged.
Additionally, the NTSB placed some of the blame on the truck driver, finding him partially responsible because he failed to yield the right of way when he made a left turn across two lanes of the highway at the intersection. The board also noted that the driver tested positive for marijuana, though “his level of impairment, if any, at the time of the crash could not be determined.”
The NTSB therefore deemed the crash to be the fault of Brown, Tesla, and the driver of the truck the Tesla drove into.
The NTSB also issued a series of recommendations to the DOT, the NHTSA, and the manufacturers of all “vehicles equipped with Level 2 vehicle automation systems.” You can see the recommendations and the full report here.
On Monday, Brown’s family issued a statement in defense of Tesla, which was picked up by Reuters.
“We heard numerous times that the car killed our son,” the statement reads. “That is simply not the case. There was a small window of time when neither Joshua nor the Tesla features noticed the truck making the left-hand turn in front of the car.”
“People die every day in car accidents,” the statement continued. “Change always comes with risks, and zero tolerance for deaths would totally stop innovation and improvements.”
Source: ntsb, overdrive, truckinginfo, thenational, reuters, wired, arstechnica, truckersreport


Well, Brown learned the hard way about not paying attention. No sympathy here. Sorry.
Not sure I agree. If a driver knowingly tried to cross in front of a car that close with an overweight truck, he should be looking at jail time. I was going to say unless he just left the shipper, but he would have had every opportunity to figure out something wasn’t right about the load before he left: air pressure on the gauges, the appearance of the tires squatting, braking distance and acceleration, trailer sway, etc.
This was just a dumb and dangerous move that shouldn’t have happened.
Where in the story did it say he was overweight? Unless I missed something.
What? Where does it say the truck was overweight?
While I have always maintained that the truck driver failed to consider the closure rate of the Tesla when he turned in front of it, in this example I feel the truck driver is a major contributor to the fatality, at least 50% at fault. That being said, nowhere have I ever read that the semi was overweight.
What does weight have to do with it? They never said he was overweight. They said he was making a left turn. Some of the blame is on the trucker. Another example of a fatal crash involving a 4 wheeler not paying attention, and the trucker gets blamed. We can’t stop, turn, or accelerate fast. PAY ATTENTION when you are driving. Like the song goes, keep your eyes on the road, keep your hands upon the wheel.
You got your stories mixed up. This Tesla accident had nothing to do with the scrap truck that was overweight.
This article didn’t say anything about an overweight truck.
The truck was in front of the car making a left turn, not crossing in front of the Tesla! Also, the overweight truck was a different story.
No sympathy? Sounds like autonomous machines can also write comments.
Hello CEO. Do you know who works for you? Better check the roster and do some research before they kill someone. 😎
Don’t get played again by big interest groups and their agenda… Seems like this story has changed just in time for this push for autonomous vehicles… And getting hit in the center of the trailer means there probably was ample time to avoid any collision, unless the car was really speeding.
I see who is driving trucks now. They are the type of people that make you box up your CB radio and clip your coax. They are the type of people that make you park in the slightest amount of precipitation because you can’t trust their logic. They are the type of people that demand you get out of your truck in the middle of the night to help them back in because they can’t do it alone.
This is the quality of the new driver that companies don’t see. And they should…
I agree, but as stated above, if he hit the center of the trailer there was plenty of warning. Autopilot failed and the driver did not brake… at all! I’ve seen a Tesla make an emergency stop and it was incredible. People make driving errors all day long, both 4 wheelers and truck drivers, so stop generalizing. The quality of driving for all demographics is declining. It’s exactly why I would love to see a future where self-driving cars are mandatory.
Not with current technology. I have OnGuard collision management and it malfunctions much of the time.
OnGuard technology is still flawed. It’s very scary when your truck locks up in snow just because you passed under a bridge.
Sure does!
Hear hear!!
By the time the technology is really here, we’ll be begging people to stop driving themselves. Driver competence is dropping fast.
Quality of all drivers.
If I remember correctly, the Tesla was going at least 10 miles over the posted limit.
I agree, not once did they mention the speed of the car. Not seeing the trailer, it may have even increased speed.
Wow! Did you read the same article? Nowhere in that article did it say he was overweight. He tested positive for pot, but they didn’t determine his level of impairment, so that may or may not have been a factor. As for putting down other truck drivers, sorry Billy Big Rigger, I didn’t realize you were born with the ability to back a truck and knew everything about it when you sat behind the wheel for the first time. The problem, as I see it, is the lack of patience and time to help the new drivers that we once had out here, and the general poor attitude. Everyone starts somewhere. Most drivers with 10 or so years of experience get complacent and forget those things. You were there once. I have 23 years of experience, drive for one of the safest fleets in the industry, and even our guys make stupid mistakes when they aren’t paying attention. So as the good Lord said, “Let he who is without sin cast the first stone.”
Why do people bring some imaginary person into the conversation?
Check your history. He was quoting Jesus. A real human that actually existed.
He was probably replying to the article about the rig that was 86 thousand pounds overweight. I had that happen to me, where I read an article and replied to it, but it ended up on a different article.
I agree
I’ve had a commercial license since 1977, Chuck. Where do you get off saying most drivers with a certain level of experience get complacent? Call me when you have 6 million miles with no accidents… As far as safest fleets go, they’re only as safe as the drivers they hire. Putting massive restrictions on employees is what leads to complacency, because you allow the company to do your thinking for you.
Exactly. I see truckers out there taking videos and laughing instead of getting their fat ass out of the truck to help. Why? Because you were once there!!
Amen! I have driven since the early eighties and you are correct, we as drivers don’t bring up and help the new drivers. I do remember how hard it was for me when I first started. Thanks to the old timers that took the time to impart their knowledge upon me, of course after they ribbed me a little for being a rookie. I learned from all those fellas and gals! That is what is wrong with drivers now!
I agree! I was hit while docked by an O/O. There are many Super Truckers out here that mess up all the time because they’ve been there, done that, and don’t need to slow down. There are many hit and runs at TS’s, and it’s not all rookies doing it either! Just wait until winter when the big boys go full speed on the ice and then wonder how or why they jackknifed. It used to be only worrying about 4 wheelers for the most part; now it’s them, the DOT, other drivers, etc. I don’t get into the BS. I go, do what I have to, and get back home, God willing.
I love how the Brown family isn’t putting any blame on the car or the truck driver. I would say they got paid very well from Tesla.
Bruce, my thoughts too. They responded with what sounds like a sound bite from one of the Bell Telephone Company “Futures of Innovation” type movies from the 60s… all very noble and forward thinking. And well compensated…
Bingo!!
I’d say they sure did
Could we just stop autonomous cars and trucks right now, before machines and computer software being at fault in major accidents becomes a more regular situation? May lawmakers not be bought by big business, and/or may liability lawsuits be increased to include them 100%…
Instead of calling it an autonomous vehicle they should label it “cruise control plus”.
I personally think the Tesla saw the distance between the dolly legs and the first axle as clear and drove straight through, decapitating its driver.
As far as whether the truck driver should have turned or not: sometimes you have to just make the move or you will be sitting there all day. People are ignorant pieces of shizz and won’t let you in, so, if there is ample space for them to see you, react and slow or stop, you DO make the move. I deal with it every day. People won’t yield to you or anyone else, even when they’re supposed to, if you’re still stationary.
I’m guessing that the street the Tesla was on had the right of way, based on what I read. That being the case, there are drivers out here, experienced or not, who don’t have the patience to wait before just pulling out into an intersection and assume the cross traffic will just slow down while he or she completes their turn. Well, that becomes more dangerous when loaded. So I’m only guessing that’s why they said the driver was partially at fault. However, that being said, not programming the Tesla to recognize such dangers was a major error on the part of whoever set it up. It’s like setting a car up to do 140 mph and assuming the guy who buys it will adhere to the rules and only speed on the interstate. Wishful thinking. I’m sorry for the family’s loss.
If a terrorist wanted to have fun these automated vehicles would be a good place to create havoc.
I wonder if there was a lot of traffic on Route 27. And just for the record, in the truck driver’s defense: in some situations there is too much traffic to make a safe turn, so the driver has to put himself out there, put his nose out in hopes that the traffic will stop or at least slow down so he can get safely across the road, and this would not make the truck driver at fault. The fault lies with the Tesla car and the driver, Mr. Brown, and as was stated already, the technology was not meant to be used in this situation.
Exactly. Brown didn’t follow the guidelines for his vehicle. But even if he was using the autopilot on a section of highway that he shouldn’t have been, if he had been paying attention he could have avoided the collision.
None of the fault lies with the truck driver. The autonomous system was inadequate to the situation, and Brown had been warned of the shortcoming. He should have taken manual control in that situation and avoided the problem. Trucks are better off taking left turns from the right lane to better see the situation on the sight side; as a driver myself, that is how I would have done it. To me it seems the driver saw Brown’s vehicle and judged that the car’s relative speed at the time made it safe to make the turn. Of course, he had no idea the car was autonomous and never expected the car to keep accelerating into him. Nor would I in a similar situation. This is a classic case of a rush to judgment, following the age-old idea that truck drivers are always at fault.
There will always be a problem with driver alertness when in autopilot mode. For goodness’ sake, you can’t keep most idiots from texting while they are in FULL control of the vehicle; how the hell do you keep them from doing it in auto mode? They will be texting, emailing, and having sex while ‘driving’. When this gets going, the highway death rate will go crazy! Same with trucks! This b.s. technology should be stopped NOW!!
The guy was watching a movie, for hell’s sake…
The first priority and responsibility when driving is to drive safely, whether it be a truck or a car. What company would not teach the Smith Rules, i.e., Get the Big Picture? Those same rules should be in every driver’s manual and training video for autos, pickups, etc., and on the test! And we should be seeing the same from the NTSB and NHTSA.
I find the marijuana in the driver’s system to be the most damning evidence against the driver, tested for impairment or not. But you know, grandma or anyone else out there driving impaired by prescription drugs or other substances is just as dangerous as this man in the Tesla watching a movie, as first reported about the accident. The Brown family sounds as if they are not about an eye for an eye, and they probably realized that watching video screens while traveling is best relegated to the back seat.
Any time you mix man and machine, the dirty six-letter word called safety overrides common sense, and with humans, you end up with Chaos. If you think Chaos cannot repeat itself, just look at human history. There is a steep learning curve for stupid. A lesson for the makers of autonomous vehicles of any kind.
…Autopilot… Too funny. Here’s the next problem coming in the next few years for the commercial driver. Wait for the day you’re out in, say, Wyoming or Utah, being passed by a Tesla, and God’s gust of wind decides to give a blow. The Autopilot in the Tesla doesn’t react. The truck moves 4 feet, over the top of the Autopilot Tesla. You, the commercial driver, will be at fault because you had coffee in your coffee cup. Because that’s the narrative being sold, as you can see from this: even though Autopilot can’t tell there’s a 53 ft trailer sideways in front of it, the human driver is somewhat still to blame…
They don’t trust people driving, which is why we have so many regulations, but they’re willing to put glitch-prone computers in the driver’s seat. Even though this Tesla didn’t glitch, it still didn’t think to stop with a huge object in front of it. Hmmm…
If you’re too damn lazy to operate your car and pay attention by yourself, then maybe you should hire a chauffeur. I understand autopilot is probably here to stay once they work the bugs out of it. I just wouldn’t want to be a guinea pig for Autopilot testing. Autopilot = autopsy. Don’t get me wrong, I wish I had a Tesla to drive. Way cool cars. But I would’ve been driving, not some unproven technology. What’s next? Autonomous motorcycles? Let’s be safe out there on the roads.