Your Tesla Can Now Come Get You. In California, That Is Not ‘Driverless.’

The new app-based feature demonstrates the latest policy lag in the wild world of autonomous vehicles.

(TNS) — Tesla unleashed the latest twist in driverless car technology last week, raising more questions about whether autonomous vehicles are outracing public officials and safety regulators.

The Palo Alto electric car company on Sept. 26 beamed a software feature called Smart Summon to Tesla owners who prepaid for it. Using a smartphone, a person can now command a Tesla to turn itself on, back out of its parking space and drive to the smartphone holder’s location — say, at the curb in front of a Costco store.

The car relies on onboard sensors and computers to help it move forward, back up, steer, accelerate and decelerate on its own, braking if it detects people, other vehicles or stationary objects in its path. The “driver” must keep a finger or thumb on the smartphone screen or the car will stop.

Tesla recommends the feature for parking lots, and the technology’s range — 200 feet — limits its applications. But in theory, a car can be summoned anywhere — to drive down a public street, for instance. Sure enough, videos quickly sprouted of Tesla owners doing just that, and more.

Is it legal? Yes, according to the California Department of Motor Vehicles. And even though the state has safety requirements that must be met before companies can deploy driverless cars, Tesla’s latest service doesn’t need a permit.

That’s because the DMV determined that the combination of Smart Summon and the cars’ robot systems doesn’t count as “autonomous technology.” The department’s rationale is that the car is “under the control” of the person holding the smartphone.

The new director of the DMV, Steve Gordon — a longtime Silicon Valley executive — declined to be interviewed for this story.

Some safety officials, however, worry that Smart Summon hasn’t been thoroughly tested and may be marketed in ways that confuse users. The National Safety Council, a nonprofit health and safety advocacy group, has expressed concern about the rush to deploy full driverless technology by Tesla and other companies.

Kelly Nantel, an NSC vice president, issued a statement on Smart Summon:

“In introducing any new advanced safety feature, it is important for manufacturers to ensure that the feature is extensively tested and mature, and that the role of the driver in controlling the vehicle is crystal clear. Failing in either of these responsibilities risks creating confusion that can put road users at risk and reduce public trust in the potential of automated vehicles.”

Meanwhile, the National Highway Traffic Safety Administration is aware of Smart Summon and is in contact with Tesla. The agency said in an email it “will not hesitate to act” if it finds evidence of a safety-related defect.

A week after Smart Summon was released, no injuries involving the technology had been reported, and no government had barred its use.

State laws on driverless cars vary dramatically. States such as Florida, Michigan and Arizona are more permissive than California, and some states have no driverless car laws at all.

In California, the only company that holds a permit for testing autonomous cars without a human inside is Waymo, the driverless car offshoot of Google. Permits have been granted to 63 companies, including Tesla, to test autonomous cars with a human on board.

Smart Summon works over Wi-Fi and cellphone networks through Tesla’s smartphone app, which also provides remote locking, unlocking and other features. Although Smart Summon’s range is limited to 200 feet, Tesla promises on its website to add traffic-light and stop-sign recognition and automatic driving on city streets by Dec. 31; it has not offered further details. Tesla did not respond to requests for comment.

In its marketing materials, Tesla says that with Smart Summon, “Your parked car will come find you anywhere in a parking lot. Really.” In smaller print, the company says, “The currently enabled features require active driver supervision and do not make the vehicle autonomous.”

When Tesla owners download the software, they get a “what’s new” message on their infotainment screen telling them they must remain responsible for the car and monitor it at all times.

Minutes after Smart Summon’s release, user videos began appearing on Facebook, Twitter and YouTube.

Many focus on the wow factor, showing dogs, kids and even a Halloween skeleton behind the wheel of the moving car. Some demonstrate the technology’s limitations or near collisions; others show users blatantly ignoring Tesla’s warnings.

Several videos show Smart Summon driving cars on public roads. Users are supposed to keep the vehicle within view at all times, but in at least some cases the feature appears to work whether they can see the car or not.

One user posted a video of himself playing catch-me-if-you-can by running away from a Tesla in a parking lot, while the car struggles to keep up. Another video shows a user testing the technology by directing his daughter and her dog to walk in front of an oncoming Tesla, apparently to see if it would stop. (It did.)

The novelty of a car driving by itself was made clear in a video that shows a man in a parking lot running toward a Tesla’s passenger door, apparently believing it was a runaway vehicle. “Nobody’s inside,” the man says. The smartphone holder fills him in on the situation.

Teslas already come equipped with Autopilot, a driver-assist feature that enables the car to steer itself and pass other vehicles on highways. For years, Autopilot has drawn fire from critics who point to videos of people sleeping, eating or reading behind the wheel. Tesla tells drivers to pay attention and keep their hands on the wheel, but CEO Elon Musk himself has been shown on YouTube and national television driving on Autopilot with his hands raised.

A car that drives without someone behind the wheel raises questions of legal liability. In some accidents involving Tesla’s Autopilot, the company has blamed drivers, saying they didn’t follow instructions in the manual. Several lawsuits have been filed against Tesla over the Autopilot feature.

Insurance coverage may be an issue. If a Tesla in Smart Summon mode hits another car, or injures a pedestrian, will the driver’s insurance policy cover the costs? The Times put that question to Geico, Farmers, Nationwide, Progressive, Hartford, Allstate, SafeAuto and Travelers. Only Travelers responded, saying it had no comment.

Asked to explain Smart Summon’s treatment under California’s driverless car rules, DMV spokesman Marty Greenfield quoted state regulations governing “autonomous technology,” which define it as “technology that has the capability to drive a vehicle without the active physical control or monitoring by a human operator.” Because Smart Summon is controlled by a smartphone, he said, it doesn’t count as autonomous.

But that very aspect, the need for a remote controller, raises concerns about reliability for one technical expert working for the insurance industry.

“One must suspect that the system is not reliably safe or the need for human supervision wouldn’t be necessary,” David Zuby, head of vehicle research for the Insurance Institute for Highway Safety, said about Smart Summon.

“The implied unreliability is the ‘troubling’ aspect of this feature because there’s already evidence that some people will not monitor the vehicle’s progress,” he said. “If it needs supervision, then it’s irresponsible to give the drivers the opportunity to be remote from the vehicle when it’s moving.”

Tesla may have more incentive to deploy autonomous features than other companies. It’s unprofitable and has been surviving for 16 years on new rounds of investor cash. Waymo, owned by Google’s parent company, Alphabet Inc., has plenty of cash and can afford to take its time. Major automakers aren’t as flush, but they are profitable with tremendous cash flow that allows for some patience too.

Public officials, meanwhile, may be left playing catch-up.

“The law is flexible, and a lot of this comes down to how does government feel about these technologies, and equally important, how does government feel about the company behind them,” said Bryant Walker Smith, a law professor at the University of South Carolina and one of the world’s leading experts on driverless vehicle law.

“If they’re not receptive, any regulator can find language they could use to shut this down,” he said. “If they are receptive, then at least until there’s a crash [companies] will have a lot of flexibility in terms of what they’re able to do.”

©2019 the Los Angeles Times. Distributed by Tribune Content Agency, LLC.