Consumer Reports' advocacy division is making new demands related to Tesla's Autopilot system.

Consumers Union (CU), the advocacy division of Consumer Reports, has called Tesla out over its Autopilot system, prompted by the recent fatal Model X crash and the media coverage that followed. The group has asked Tesla to improve the system and to issue a new statement explaining its claim that Autopilot is the "world's safest" system; CU wants more public data supporting such claims.

Related: Tesla Fires Back: NTSB Removes Tesla From Investigation Into Deadly Model X Crash

Tesla has confirmed that Autopilot was engaged during the deadly crash, as was also the case in an earlier fatal incident in Florida. However, the automaker maintains that the drivers should have been paying attention.

According to CU, Autopilot should restrict its operation to conditions it can handle safely. The group believes the system can be activated in situations where it isn't necessarily safe to use, and it's concerned that Tesla's "hands-on" warning isn't a sufficient safeguard. David Friedman, Consumers Union's Director of Cars and Product Policy and Analysis, explained:

After another tragedy involving Autopilot, Tesla should commit to put safety first—and to stop using consumers as beta testers for unproven technology. While the results of the crash investigations will be critical to understanding all that contributed to this tragedy, previous NTSB findings already showed that Autopilot should do more to protect consumers. We see no excuse: Tesla should improve the safety of Autopilot without delay.

Tesla markets itself as an innovator. It should not put lives at risk, damage its reputation, or risk the success of its systems—or driver assist technology as a whole—by failing to take steps that would better protect consumers’ safety. Further, the company should not make either specific or broad safety claims without providing the detailed data to back them up. They should show, not just tell, us how safe their system is.

Instead of issuing a defensive Friday evening blog post or statements blaming the victim, Tesla should fix Autopilot’s design and be transparent about their safety claims. The company should publicly provide detailed data to demonstrate conditions for which its Autopilot system can safely operate. It should limit Autopilot’s operation only to those conditions, and have a far more effective system to sense, verify, and safely react when the human driver’s level of engagement in the driving task is insufficient or when the driver fails to react to warnings. If other companies can do it, Tesla should as well. Further, this would fulfill the NTSB recommendations made more than six months ago.

Consumer Reports and Consumers Union have asked automakers to do a better job of making sure drivers understand each system's limits, and to ensure there is a backup in place in case a driver overestimates the technology's capabilities. With regard to Tesla specifically, the organizations have already requested that Autopilot cease operating in certain situations.

Consumers Union's recent article explains:

In addition, Consumers Union urged the U.S. Senate and NHTSA to take action in response to the NTSB’s September 2017 recommendations and require critical safeguards in vehicles with partially or conditionally automated driving technologies. The NTSB’s recommendations included that the Department of Transportation and NHTSA should develop and issue mandatory performance standards for these systems and ensure better collection of crash data. The NTSB also recommended that manufacturers should limit (and NHTSA should verify that they have limited) the use of automated driving systems to appropriate circumstances and develop systems to more effectively sense a human driver’s level of engagement and alert the driver when automated driving systems are in use and the driver is inattentive.

Source: Consumers Union