Teslas running Autopilot involved in 273 crashes reported since last year:

Tesla vehicles running its Autopilot software have been involved in 273 reported crashes over roughly the past year, according to regulators, far more than previously known and providing concrete evidence regarding the real-world performance of its futuristic features.

The key to understanding Autopilot at this stage of development is that the driver must continue to supervise the car's operation. It’s not a “set it and forget it” sort of tech. My Model 3 has the older 2.5 hardware and Enhanced Autopilot. It does great on the freeways (not good, great). The car drove at least 90% of my recent trip from Salem to Boise, Idaho. We had one phantom braking event, where the car suddenly slowed during a lane change. I pressed the accelerator and took over control. It was not what I would consider a dangerous event, as there were no other cars nearby.

The difference between driving and supervising is profound. I arrive at destinations much more alert and energized than if I have to drive. I love Tesla’s Enhanced Autopilot, even as I acknowledge its imperfections.

I don’t know the details of the Autopilot crashes, but I’d throw out all cases where the human was not supervising. You can’t safely use Autopilot and play video games, watch DVDs, etc. If you choose to, well, that’s not Tesla’s fault. Autopilot has successfully driven millions of miles in hundreds of thousands (if not millions) of cars. Used as intended, which is to say supervised, it’s a fantastic tool. 

I’m fine with a federal investigation of Autopilot crashes. They should be investigated. But 273 crashes is not significant when placed in the appropriate context, namely the total number of miles driven under Autopilot. Let’s hope the feds provide that context.
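
For anyone who wants to sanity-check that kind of claim, here is a rough back-of-envelope sketch of the rate calculation I have in mind. The mileage figure below is a placeholder I made up purely for illustration; it is not a reported number, and the real context depends on whatever Autopilot mileage NHTSA or Tesla ultimately disclose.

```python
# Back-of-envelope: crashes per million miles driven on Autopilot.
# The crash count comes from the NHTSA reporting cited above; the
# mileage total is a HYPOTHETICAL placeholder, not a reported figure.
reported_crashes = 273
assumed_autopilot_miles = 3_000_000_000  # illustrative assumption only

crashes_per_million_miles = reported_crashes / (assumed_autopilot_miles / 1_000_000)
print(f"{crashes_per_million_miles:.3f} crashes per million Autopilot miles")
```

Swap in the real mileage when it's published and the same arithmetic tells you whether 273 is alarming or unremarkable.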