A small group of private citizens, a few thousand at most, has been testing the system for
nearly a year, and the videos they’ve posted on social media of Teslas attempting to drive themselves through traffic have drawn both cautious praise and derision. Tesla has not released information on exactly
how many drivers, who they were, or how they were selected to be in the first “full self-driving” public beta test. Tesla
has been criticized for not getting consent from the pedestrians, cyclists and other drivers who share the street with the cars testing “full self-driving.”
Tesla did not respond to a request for comment and generally does not engage with the professional news media.
Tesla drivers will also have to make a privacy trade-off: drivers who want early access to the technology must agree to let Tesla collect data on their driving style and judge it.
Here’s a rundown of common questions about the technology:
What is “full self-driving”?
Tesla claimed in
2016 that all of its new vehicles had the hardware capability for “full self-driving” and that it would soon offer the complementary software to make the cars drive themselves.
Musk has said that he thinks people will be able to
fall asleep in Teslas as they drive. He’s spoken of a future that includes a million
robotaxis and Teslas driving themselves
across the country.
But the available version of “full self-driving” is a far cry from those ambitious claims, and it requires drivers to remain vigilant. Drivers who have used early versions of “full self-driving” have had to intervene to prevent their cars from crashing into things or driving on the wrong side of the road. Drivers have sometimes praised the technology; at other times they have criticized it as no better than a drunk driver. Many have said the technology appears to be improving over time. Even so, “full self-driving” may handle a situation perfectly one time, then fail the next time it faces the same situation.
Tesla is rolling out access to “full self-driving” as its customers have grown frustrated after waiting years for the technology. Some are increasingly skeptical of Musk’s claims.
Even Tesla requires drivers signing up for the technology to acknowledge that “FSD Beta does not make my car autonomous.”
Most autonomous vehicle experts say that full self-driving means a car in which a person could safely fall asleep behind the wheel, because no attentive human driver is needed. Regulators have
repeatedly criticized Tesla’s use of the term “full self-driving.”
So far their actions have been more bark than bite. The National Highway Traffic Safety Administration has said repeatedly in statements that there are no vehicles available for sale that can drive themselves. But driver-assist systems like Autopilot and “full self-driving” are not currently regulated, so Tesla and other automakers can deploy whatever driver-assist technology they want. There are signals this may change.
NHTSA has launched an investigation into Teslas on Autopilot rear-ending emergency vehicles stopped in the roadway. The agency has also requested extensive data from automakers on their driver-assist systems, and the tough talk has continued.
“Tesla is putting untrained drivers on public roads as testers for their misleadingly-named, unproven system—a seeming recipe for disaster,” U.S. Senator Richard Blumenthal (D-Conn.) said Sunday. “Serious safety concerns should put this reckless plan in reverse. It’s Russian Roulette for unsuspecting drivers & the public.”
Blumenthal has called for the Federal Trade Commission to investigate Tesla’s autonomous driving features, and has cheered the NHTSA investigation.
How is Tesla deciding which drivers will get access to “full self-driving”?
Tesla announced a “safety score” on Saturday, which it says will estimate the likelihood that a driver could be in a collision. The safety score will track hard braking, aggressive turning, tailgating, forward collision warnings and Autopilot disengagements, according to Tesla. (Autopilot generally refers to Tesla’s more rudimentary suite of driver-assist features, like traffic-aware cruise control.)
Musk has said that drivers will be granted access to the “full self-driving” beta if their driving is “good” for seven days.
People who have shared Tesla safety scores on social media so far have offered a range of reviews. Many welcomed the scores. Some have expressed surprise at how high their score was given their driving style, while others have said their score seemed lower than expected. Some have described driving in a way that games the system to improve their score but isn’t actually typical of a safe driver.
One Tesla owner
said he had achieved a score of 95 out of 100 after running yellow lights, not braking for a cyclist, and rolling through stop signs.
Musk has
said that the safety score “will evolve over time to more accurately predict crash probability.”
Not everyone gets access yet
Tesla owners who have an older version of Tesla’s touchscreen computer in their vehicles have said on social media and to CNN Business that they do not have the option to sign up for “full self-driving.”
Tesla owners with early model vehicles, built before the “full self-driving” hardware update in 2016, also generally cannot get access. Tesla owners outside the United States have also said on social media that they do not have the option to request “full self-driving.”
It is unclear, however, how many drivers in total will receive the option to request access to the “full self-driving” beta software. Tesla does not release information on how many drivers have purchased the option, nor has it said precisely when, how, or how many drivers will be able to oversee their cars attempting to drive themselves.