In fact, the reason this accident happened is spelled out plainly in the owner's manual.
If I haven't hammered this idea into your head hard enough yet, I'll say it again: Tesla's Autopilot, and other advanced driver-assistance systems, require the human to be attentive at all times. When you don't pay attention, dumb things happen, and everybody's going to see right through your excuses. That's what happened with the first Autopilot-related crash in China.
The Tesla in question, a Model S sedan, exchanged a bit of paint with a car stranded on the shoulder after the Tesla's driver failed to react in time. The car the Tesla was following moved out of the way of the stranded motorist with more than enough time for the driver to see the giant hunk of metal blocking part of the lane. Both cars sustained minor damage.
That's it. The driver didn't react in time. Perhaps he or she thought that Autopilot would take care of it, despite the owner's manual warning that Autopilot's systems may not recognize stationary vehicles. That same over-reliance on the system is what caused a similar crash in Europe earlier this year.
To put it simply, Autopilot is not a replacement for the driver. It makes driving in traffic more convenient, sure, but the human should always be ready to react at a moment's notice. Autopilot's goal isn't to give you more free time to catch Pokemon or read the newspaper.
Crashes like this typically come with a great deal of media hype, which can stunt the development of semi-autonomous systems as people freak out and think they're dangerous. The only dangerous thing here is an inattentive driver.
(Hat tip to Car News China!)