It’s a sight that would startle passing motorists – a 70,000-pound Freightliner Inspiration heavy-duty truck rolls down a Nevada interstate as the driver plays with a laptop computer, ignoring the road.

While Daimler Trucks is only testing the Freightliner self-driving truck on Nevada roads now, this could become a common sight over the next 20 years.

Autonomous truck sales could reach 60,000 annually by 2035, according to IHS Automotive, an industry research firm. That would amount to 15 percent of sales in the heavy Class 8 truck segment, IHS forecasters said.

Automotive manufacturers are investing billions of dollars in self-driving vehicle technology, with most of the attention focused on the passenger-car side of the transportation industry. But a recent study from consulting firm Roland Berger found that autonomous technology will also deliver significant benefits — such as labor and fuel savings — to the heavy-duty trucking industry.

That’s something that truck manufacturers like Daimler are exploring — but industry officials are quick to warn that putting self-driving vehicles on the road, whether passenger cars or fully loaded semis, still requires overcoming significant technological hurdles. And that’s before addressing the social and legal challenges.

“There won’t be one giant leap forward,” said Derek Rotz, director of advanced engineering for Freightliner’s U.S. parent, Daimler Trucks North America. “There will be incremental steps.”

The “base technology” being used for the Freightliner Inspiration prototype is “largely an adaptation” of what Germany’s Daimler AG is also using in its passenger car models like the newly redesigned 2017 Mercedes-Benz E-Class, Rotz said.

The automaker bills the new E300 sedan as “the most intelligent car in the world,” and it introduces new autonomous features such as Lane Change Assist. When a driver activates the radar-guided smart cruise-control system and taps the turn signal, the E300 will pass a slower car without the driver’s intervention. But Mercedes has limited the amount of time the sedan will operate hands-free to just 30 seconds – and then only on limited-access highways.

Tesla Motors, meanwhile, had to dial back the semi-autonomous Autopilot system it launched late last year for its Model S and Model X when it became clear the technology couldn’t handle every situation it might encounter. It could be confused, for example, by highway lane markers leading to an exit ramp.

“We’re a long way from the finish line,” said Gill Pratt, an autonomous vehicle researcher and chief executive of the Toyota Research Institute, or TRI.

Until recently, Toyota was skeptical of autonomous technology. But as its automotive rivals and the National Highway Traffic Safety Administration pushed further into self-driving vehicle research, it launched TRI with $1 billion in funding. The Japanese automaker recruited Pratt, a roboticist and former specialist on self-driving systems with the Defense Advanced Research Projects Agency, to head the research arm.

Like Freightliner’s Rotz, Pratt said the road to self-driving vehicles is paved with incremental steps. There are many obstacles ahead. For example, even the most sophisticated vehicle sensors today find it difficult to tell the difference between sand, snow and ice, Pratt said. Although each of those surfaces can be slippery, they must be handled quite differently. Most human drivers understand how to navigate difficult road surfaces, but they present a technical challenge for a robotic driver.

One of the biggest obstacles to autonomous driving will be teaching the artificial intelligence systems to understand how human drivers think.

That has presented a problem for Google’s self-driving car program. Its autonomous vehicles have been involved in nearly 20 crashes over the last six years. All but one were blamed on the other driver — the human. But the Google prototypes may have contributed to some of those collisions by obeying the law in ways human drivers don’t.

At stoplights, for example, the vehicles have been taught to stop immediately if the light turns yellow. Yet in many situations — such as heavy traffic in urban regions of California — motorists will often make a left turn at the last moment. Expecting the Google cars to do the same, other drivers have rammed the prototypes on a number of occasions. Meanwhile, the one crash for which Google admits responsibility came when a prototype autonomous car misjudged how a bus driver would handle merging traffic.

Google recently inked a deal with Fiat Chrysler Automobiles to test its technology in 100 new Chrysler Pacifica plug-in hybrid minivans. The deal was significant because it moves the company’s technology closer to production. It also gives Google an opportunity to test autonomous vehicles in snow and other cold-weather conditions at a new suburban Detroit tech hub. Until now, Google has limited testing to the warm, clear roads around its headquarters in Silicon Valley and its technical center in Austin, Texas.

Other automakers and even the U.S. Army are sorting through how to get autonomous vehicle technology on the road quickly.

Nissan hopes to have as many as 10 autonomous vehicles in production by 2020. But Tetsuo Ijima, chief engineer of the Nissan autonomous vehicle program, said that is an ambitious goal.

Handling a freeway is relatively easy, “but we will have to learn how to handle every possible situation a human driver might experience,” Ijima said during a drive through Tokyo in a Nissan autonomous prototype.

If anything, experts caution, people will have less tolerance for mistakes by an autonomous vehicle. To better understand those challenges and determine how to deal with them, Nissan put together a team that includes a former NASA rocket scientist, an anthropologist and a sociologist. Among their challenges: figuring out how to let autonomous vehicles share the road with humans, whether they’re driving, walking or riding a bicycle.

Humans often rely on facial expressions and gestures, especially in urban situations such as four-way stops, or when encountering a pedestrian at a crosswalk. What to do in those circumstances? Some researchers believe tomorrow’s autonomous vehicles might need facial recognition software and signal lights to replace a human’s nod or wave.

It is all part of what the industry calls “granularity.”

Imagine coming in for a landing in a helicopter. At 1,000 feet, the pavement of a parking lot looks almost as smooth as glass. Get a little closer and the potholes come into view. Down on the ground, every pebble is visible.

Most of the time, driving is easy. But it’s the encounters with little pebbles that humans are best at coping with. Until autonomous vehicles can recognize and respond to the same set of challenges humans face routinely as they drive from Point A to Point B, this new technology simply won’t be ready to take over the driver’s seat.