By Darrell Etherington for TechCrunch.
Uber’s self-driving cars are making the move to San Francisco: in a new expansion of its autonomous vehicle pilot, Volvo SUVs outfitted with sensors and supercomputers will begin picking up passengers in the city.
The autonomous cars won’t operate completely driverless for the time being – as in Pittsburgh, where Uber launched self-driving Ford Fusion vehicles this fall, each SUV will have a safety driver and an Uber test engineer onboard to handle manual driving when needed and to monitor the tests’ progress. But the cars will still be picking up ordinary passengers – any customer who requests uberX using the standard consumer-facing mobile app is eligible for a ride in one of the new XC90s operated by Uber’s Advanced Technologies Group (ATG).
Third-generation autonomy
There’s a difference here beyond geography: this is the third generation of Uber’s autonomous vehicle, distinct from the second-generation Fords used in the Pittsburgh pilot. Uber also has a more direct relationship with Volvo in turning the new XC90s into autonomous-capable cars; the Fords were essentially purchased stock off the line, while the Volvo partnership lets Uber integrate its own sensor array more deeply with the sensors already on board the vehicle.
Uber ATG Head of Product Matt Sweeney told me in an interview that this third-generation vehicle actually uses fewer sensors than the Fords on the roads in Pittsburgh, though the loadout still includes a full complement of traditional optical cameras, radar, LiDAR and ultrasonic detectors. Fewer sensors are needed, he said, partly because of lessons learned from the Pittsburgh rollout and from studying previous-generation vehicles: with autonomy, you typically start by throwing everything you can think of at the problem, then narrow down to what’s specifically useful and shed what turns out not to be necessary. Still, the fused image of the world that results from the Volvo’s sensor suite does not lack for detail.
“You combine [images and LiDAR] together you end up with an image which you know very explicitly distance information about, so it’s like this beautiful object that you can detect as you’re moving through,” Sweeney explained to me. “And with some of the better engineered integration here, we have some radars in the front and rear bumpers behind the facades.”
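It’s easier to picture that fused image with a toy example. The sketch below is illustrative only – the camera intrinsics and sample point are invented, and nothing here reflects Uber’s actual stack – but it shows the essence of what Sweeney describes: projecting LiDAR returns into a camera image so each hit pixel carries explicit distance information.

```python
import numpy as np

# Minimal camera/LiDAR fusion sketch: project 3D LiDAR returns into a
# camera image so each hit pixel carries an explicit depth (meters).
# The intrinsic matrix below is a made-up example, not a real sensor's.
K = np.array([[1000.0,    0.0, 640.0],
              [   0.0, 1000.0, 360.0],
              [   0.0,    0.0,   1.0]])  # pinhole camera intrinsics

def lidar_to_depth_image(points_cam, width=1280, height=720):
    """points_cam: (N, 3) LiDAR points already in the camera frame."""
    depth = np.full((height, width), np.inf)
    pts = points_cam[points_cam[:, 2] > 0.1]   # keep points ahead of the lens
    uvw = (K @ pts.T).T                        # perspective projection
    u = (uvw[:, 0] / uvw[:, 2]).astype(int)
    v = (uvw[:, 1] / uvw[:, 2]).astype(int)
    ok = (0 <= u) & (u < width) & (0 <= v) & (v < height)
    for x, y, z in zip(u[ok], v[ok], pts[ok, 2]):
        depth[y, x] = min(depth[y, x], z)      # nearest return wins per pixel
    return depth

# A single return 20 m straight ahead lands at the image center with depth 20:
demo = lidar_to_depth_image(np.array([[0.0, 0.0, 20.0]]))
print(np.unravel_index(np.argmin(demo), demo.shape), demo.min())  # (360, 640) 20.0
```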
Those radar arrays provide more than the ability to see in conditions where optical sensors struggle, such as poor weather; Sweeney notes that the radar units Uber is using can actually bounce their signal off the surface of the road, underneath or around the vehicles ahead, to detect and report potential accidents or hazards that aren’t immediately in front of the autonomous Uber itself.
What Volvo brings
“The car is one of the reasons we’re really excited about this partnership, it’s a really tremendous vehicle,” Sweeney said. “It’s Volvo’s new SPA, the [Scalable Product Architecture] – the first car on their brand new, built from the ground up vehicle architecture, so you get all new mechanical, all new electrical, all new compute.”
Uber didn’t pick a partner blindly – Sweeney says they found a company with a reputation built on nearly a hundred years of solid engineering and manufacturing, and a commitment to iterative improvement in both.
“The vehicle that we’re building on top of, we’re very intentional about it,” Sweeney said, noting that cars like this one are engineered specifically for safety – and that the vehicle itself is rarely the failure point in automobile accidents today; that role is reserved for the human drivers behind the wheel.
Uber’s contributions are mainly in the sensor pod, and in the compute stack in the trunk, which takes up about half the surface area of the storage space and which Sweeney described as “a blade architecture, a whole bunch of CPUs and GPUs that we can swap out under there,” though he wouldn’t say who supplies those components specifically. Taken together, that tremendous computing power is the key to identifying objects, doing so in higher volume, and doing better pathfinding in complex city-street environments.
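For a flavor of what “pathfinding” means in the abstract, here is a textbook A* search on a toy street grid – a generic illustration of the problem class that planners solve, not Uber’s planner; the grid, costs and blockage are all invented.

```python
import heapq

# Toy A* pathfinding on a grid: 0 = drivable cell, 1 = blocked.
# A textbook sketch of the generic problem class, not Uber's stack.
def astar(grid, start, goal):
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan heuristic
    frontier = [(h(start), 0, start, [start])]  # (est. total, cost so far, node, path)
    seen = set()
    while frontier:
        _, cost, node, path = heapq.heappop(frontier)
        if node == goal:
            return path
        if node in seen:
            continue
        seen.add(node)
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                heapq.heappush(frontier, (cost + 1 + h((nr, nc)), cost + 1,
                                          (nr, nc), path + [(nr, nc)]))
    return None  # no route found

city = [[0, 0, 0, 0],
        [1, 1, 0, 1],   # a mostly blocked cross street
        [0, 0, 0, 0]]
print(astar(city, (0, 0), (2, 0)))  # routes around the blockage
```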
The ride
For the actual rider, there’s an iPad-based interactive display in the rear of the vehicle, which takes over for the mobile app once you’ve actually entered the vehicle and are ready to start your ride. The display guides you through the steps of starting your trip, including ensuring your seat belt is fastened, checking your destination and then setting off on the ride itself.
During our demo, the act of actually leaving the curb and merging into traffic was handled by the safety driver on board, but in the eventual full deployment the vehicles will handle even that tricky task. The iPad shows you when you’re in active self-driving mode, and when it’s been disengaged and steering is being handled by the person behind the wheel instead. The screen also shows a simplified version of what the autonomous car itself “sees”: rudimentary, color-coded point and line renderings of the objects and world surrounding the vehicle, displayed on a white background. Objects in motion leave trails as they move through this real-time virtual world.
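Those motion trails are a simple idea in principle: keep a short, fixed-length history of each tracked object’s recent positions and render the older points behind the live marker. A minimal sketch of that pattern follows – the object IDs, colors and coordinates are invented, and this is not Uber’s display code.

```python
from collections import deque

TRAIL_LEN = 10  # frames of position history to keep per object

class TrackedObject:
    """One object on the rider display, with a fading motion trail."""
    def __init__(self, obj_id, color):
        self.obj_id = obj_id
        self.color = color                    # e.g. color-coded by object class
        self.trail = deque(maxlen=TRAIL_LEN)  # oldest positions fall off the end

    def update(self, x, y):
        self.trail.append((x, y))             # record the latest tracked position

    def render_points(self):
        """Points to draw, oldest first; a renderer can fade the older ones."""
        return list(self.trail)

car = TrackedObject("vehicle-42", color="blue")   # hypothetical tracked car
for t in range(15):
    car.update(t * 1.5, 0.0)                      # object moving along x
print(car.render_points())   # only the 10 most recent positions remain
```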
The iPad-based display also lets you take a selfie and share the image from your ride, which definitely helps Uber promote its efforts while also serving the iPad’s other key goal – making riders feel like this tech is both knowable and normal. Public perception remains one of autonomous driving’s highest hurdles, alongside the technology itself and regulation, and selfies are one seemingly shallow but legitimate way to address it.
So how did I feel during my ride? About as excited as I typically feel during any Uber ride once the initial thrill wore off – which is to say, mostly bored. The vehicle I was in had to negotiate heavy traffic, a lot of construction and very unpredictable south-of-Market San Francisco drivers, and as such it disengaged fairly frequently. But it also handled long open stretches of road at speed with aplomb, and kept its distance well in denser, stop-and-go traffic. Overall it felt like a system making good progress in terms of learning – but one that still has a long way to go before it can do without its human minders up front.
My companion for the ride in the backseat was Uber Chief of Watch Rachel Maran, who previously served as a driver in Uber’s self-driving pilot in Pittsburgh. She explained that the unpredictability and variety of any new driving environment will be among the biggest challenges Uber’s autonomous driving systems have to overcome.
Seeking a safety step-change
Uber’s pilot in San Francisco will be limited to the downtown area and will involve “a handful” of vehicles to start, with the intent of ramping up from there, according to the company. The autonomous vehicles in Pittsburgh will continue to run concurrently with the San Francisco deployment. Where Pittsburgh offers a range of weather conditions and other environmental variables for testing, San Francisco will present new challenges for Uber’s self-driving tech, including denser, often more chaotic traffic, plus narrower lanes and roads.
Safety’s super important – that’s why we’re doing this.
The company says it doesn’t require a permit from the California DMV to operate in the state because, with a safety operator always present onboard, the cars don’t qualify as fully autonomous as defined by state law. Legally, they’re more akin to a Tesla with Autopilot than to a self-driving Waymo car under current regulatory rules.
Ultimately, Uber’s goal with autonomy is to create safer roads, according to Sweeney, while also easing the urban planning and space problems that stem from an ownership model in which most cars sit idle and unused roughly 95 percent of the time. I asked Sweeney about concerns from drivers and members of the public, who can be very vocal about autonomous tech’s safety on real roads.
“This car has got centimeter-level distance measurements 360 degrees around the vehicle constantly, 20 meters front and 20 meters back constantly,” Sweeney said, noting that even though the autonomous decision-making remains “a really big challenge,” the advances achieved by the sensors themselves and “their continuous attention and superhuman perception […] sets us up for the first really marked decrease in automotive fatalities since the airbag.”
“I think this is where we really push it down to zero,” Sweeney added. “People treat it as though it’s a fact of life; it’s only because we’re used to it. We can do way better than this.”