Uber spent $457 million last year on research and development of autonomous vehicles, flying cars (known as eVTOLs) and other “technology programs” and will continue to invest heavily in the futuristic tech even though it expects to rely on human drivers for years to come, according to the company’s IPO prospectus filed Thursday.
R&D costs at Uber ATG, the company’s autonomous vehicle unit, its eVTOL unit Uber Elevate and other related technology represented one-third of its total R&D spend. Uber’s total R&D costs in 2018 were more than $1.5 billion.
Uber filed its S-1 on Thursday, laying the groundwork for the transportation company to go public next month. This comes less than one month after competitor Lyft’s debut on the public market. Uber is listing on the New York Stock Exchange under the symbol “UBER,” but has yet to disclose the anticipated initial public offering price.
Uber believes that autonomous vehicles will be an important part of its offerings over the long term, namely that AVs can increase safety, make rides more efficient and lower prices for customers.
However, the transportation company struck a more conservative tone in the prospectus on how and when autonomous vehicles will be deployed, a striking difference from the early days of Uber ATG when former CEO Travis Kalanick called AVs an existential risk to the business.
Uber contends there will be a long period of “hybrid autonomy” and it will continue to rely on human drivers for its core business for the foreseeable future. Uber said even when autonomous vehicle taxis are deployed, it will still need human drivers for situations that “involve substantial traffic, complex routes, or unusual weather conditions.” Human drivers will also be needed during concerts, sporting events and other high-demand events that will “likely exceed the capacity of a highly utilized, fully autonomous vehicle fleet,” the company wrote in the S-1.
Here’s an excerpt from the S-1:
Along the way to a potential future autonomous vehicle world, we believe that there will be a long period of hybrid autonomy, in which autonomous vehicles will be deployed gradually against specific use cases while Drivers continue to serve most consumer demand. As we solve specific autonomous use cases, we will deploy autonomous vehicles against them. Such situations may include trips along a standard, well-mapped route in a predictable environment in good weather.
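To make that use-case gating concrete, here is a toy sketch in Python (an invented illustration, not anything from Uber’s actual systems) of a dispatch rule that only routes a trip to an autonomous vehicle when the conditions the S-1 names all hold:

```python
# Toy illustration of "hybrid autonomy" dispatch: an AV serves a trip only
# when every condition named in the S-1 holds; otherwise a human driver does.
# This is an invented example, not Uber's actual dispatch logic.

def assign_trip(route_is_well_mapped: bool,
                weather_is_good: bool,
                demand_is_surging: bool) -> str:
    """Return which side of the hybrid fleet should serve a requested trip."""
    if route_is_well_mapped and weather_is_good and not demand_is_surging:
        return "autonomous"
    return "human_driver"

# A well-mapped route in good weather goes to the AV fleet; bad weather or a
# concert-night demand surge falls back to human drivers.
```

The point of the sketch is that human drivers absorb everything outside the narrow cases an AV has been validated for, which is exactly the “hybrid autonomy” period Uber describes.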
Uber contends it is well-suited to balance that potentially awkward in-between phase when both human drivers and autonomous vehicles will co-exist on its platform.
“Drivers are therefore a critical and differentiating advantage for us and will continue to be our valued partners for the long-term,” Uber wrote.
Despite Uber’s forecast and more tempered tone, the company is pushing ahead on autonomous vehicles.
Uber ATG was founded in 2015 in Pittsburgh with just 40 researchers from Carnegie Robotics and Carnegie Mellon University. Today, Uber ATG has more than 1,000 employees spread across offices in Pittsburgh, San Francisco and Toronto.
Uber acknowledged under the risk factors section of the S-1 that it could fail to develop and successfully commercialize autonomous vehicle technologies or could be undercut by competitors, which would threaten its ride-hailing and delivery businesses.
Uber’s view of which companies pose the biggest threat to the company was particularly interesting. The company named nearly a dozen potential competitors, a list that contained a few of the usual suspects like Waymo, GM Cruise and Zoox, as well as lesser-known startups such as May Mobility and Anthony Levandowski’s new company, Pronto.ai. Other competitors listed in the S-1 include Tesla, Apple, Aptiv, Aurora and Nuro. Argo AI, the subsidiary of Ford, was not listed.
ATG has built more than 250 self-driving vehicles and has three partnerships — with Volvo, Toyota and Daimler — that illustrate the company’s multi-tiered strategy for AVs.
Uber has a first-party agreement with Volvo. Under the agreement announced in August 2016, Uber owns Volvo vehicles, has added its AV tech and plans to deploy those cars on its own network.
Its partnership with Daimler is on the other extreme. In that partnership, announced in January 2017, Daimler will introduce a fleet of its own AVs on the Uber network. This is similar to Lyft’s partnership with Aptiv.
Finally, there’s Toyota, a new partnership just announced in August 2018, that is a hybrid of sorts of the other two. Uber says it expects to integrate its autonomous vehicle technologies into purpose-built Toyota vehicles to be deployed on its network.
Painting the sky with your own aurora is as easy as launching a NASA rocket into the Norwegian night.
The space agency orchestrated its own version of the light phenomenon on Friday in order to understand the amount of energy auroras generate within Earth’s upper atmosphere and beyond.
NASA launched two sounding rockets from Norway’s Andøya Space Center to generate artificial auroras against the backdrop of a real aurora. And it looks pretty magical:
The process is (somewhat uncomfortably) known as “auroral forcing,” and is the main focus of the NASA-funded Auroral Zone Upwelling Rocket Experiment (AZURE).
The project will see eight rocket missions launch over the next two years from Norway’s Andøya and Svalbard rocket ranges.
“The more we learn about auroras, the more we understand about the fundamental processes that drive near-Earth space — a region that is increasingly part of the human domain, home not only to astronauts but also communications and GPS signals that can affect those of us on the ground on a daily basis,” NASA’s blog post reads.
The sounding rockets, launched two minutes apart and reaching an altitude of 320 kilometres (198 miles), took measurements of the atmospheric density and temperature.
When the timing was just right, the rockets deployed both trimethyl aluminum (TMA) and a barium/strontium mixture, which ionises when exposed to sunlight.
NASA Sounding Rockets Program Office and ASC launched two sounding rockets in the AZURE project tonight at 2214 UTC. The two vehicles were launched two minutes apart, reaching 320 km altitude while releasing a visible gas to investigate conditions inside the aurora borealis. pic.twitter.com/nrqHJt1Hfx
These colourful clouds work as tracers, so researchers can study the vertical winds in the region — these winds create what NASA refers to as a “particle soup” that redistributes the atmosphere’s energy, chemical constituents, and momentum.
NASA said the chemicals released “pose no hazard to residents in the region.”
Projects like Alphabet’s Waymo, Uber, Cruise and Aurora are developing autonomous vehicles by throwing engineers at the problem, basing most of their platforms on rule-based systems that try to pre-empt and deal with every edge case, whilst also peppering the cars with more sensors to capture more data. This can work in relatively controlled environments but has the drawback of not being able to flexibly adapt in real time to fast-changing situations.
Despite all this investment and many years of development, no one has yet been able to launch a commercial autonomous car service. It’s just very hard to hand-engineer. What’s required is not more eyes but better coordination. The simple answer — as it is to almost everything these days — would be to throw AI at the problem, and that’s what many startups, which lack the engineering and hardware muscle of the big players, are trying to do.
We reported on UK startup Wayve last year when it announced its existence, but at the time it had nothing to show for its claims.
Now the company says it does, and the results are not only fascinating but might also be genuinely innovative.
In fact, they are claiming a “world first” in demonstrating that a car working on their machine-learning platform can drive on roads it’s never seen before during training, and without an HD map of its environment. Other systems, like Waymo’s, rely on maps and rules to drive. Theirs, says Wayve, does not.
Additionally, the startup revealed that it has been testing its platform on the Jaguar I-PACE SUV, which won 2019 European Car of the Year. Interestingly, this is also a car that Waymo has used in some of its tests.
Alex Kendall, co-founder and CTO, tells me: “Our cars learn to drive from data with machine learning. Every time a safety driver intervenes and takes over, the car learns to drive better. We don’t tell the car how to drive; rather, it learns to drive from experience, example and feedback, just like a human. This is safer and more scalable than any other approach today.”
Emerging out of research from the University of Cambridge, Wayve has already undertaken extensive testing on public UK roads.
Kendall says: “The traditional approach used by all our competitors relies on HD-maps, expensive sensor suites and hand-coded rules that tell the car how to drive. We have built a system that learns end-to-end with machine learning. It is the first in the world to drive on urban roads it has never been on before. It uses compute/sensors which cost less than 10% of competitors.”
Bold claims. But a video revealed today on its site shows the system driving on public roads in Cambridge, UK — roads it has never been on before — using only a sat-nav route map and basic cameras.
“We don’t tell the car how to drive with hand-coded rules: everything is learned from data. This allows us to navigate complex, narrow urban European streets for the first time. End-to-end deep learning,” says Kendall.
“Our model learns both lateral and longitudinal control (steering and acceleration) of the vehicle with end-to-end deep learning. We propagate uncertainty throughout the model. This allows us to learn features from the input data which are most relevant for control, making computation very efficient. In fact, everything operates on the equivalent of a modern laptop computer. This massively reduces our sensor & compute cost (and power requirements) to less than 10% of traditional approaches,” he says.
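The uncertainty propagation Kendall describes echoes his academic work on Bayesian deep learning. As a rough illustration (a simplified sketch of the general technique, not Wayve’s code), a model that predicts a variance alongside each control output can be trained with a Gaussian negative log-likelihood, which automatically down-weights inputs the model admits it is unsure about:

```python
import math

# Simplified sketch of the loss behind "we propagate uncertainty throughout
# the model": the network predicts a mean control value (e.g. steering) plus
# a log-variance, and the Gaussian negative log-likelihood trades them off.
# An illustration of the general technique, not Wayve's implementation.

def gaussian_nll(target: float, mean: float, log_var: float) -> float:
    """Per-sample heteroscedastic Gaussian negative log-likelihood."""
    return 0.5 * (log_var + (target - mean) ** 2 * math.exp(-log_var))

# For the same 0.5-unit steering error, a confidently wrong prediction
# (tiny predicted variance) is penalized far more than one that admits
# uncertainty, so the model learns when its inputs are unreliable.
overconfident = gaussian_nll(0.5, 0.0, math.log(0.01))
honest = gaussian_nll(0.5, 0.0, math.log(1.0))
```

Summed over both steering and acceleration outputs, a loss like this lets the network learn which input features actually matter for control, which is one plausible reading of the efficiency claim.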
Assuming other independent observers can confirm these claims, it looks like a UK startup just leap-frogged the entire autonomous car space.
For now, Wayve is being coy about its investors, saying only that Professor Zoubin Ghahramani, Chief Scientist of Uber, is an investor.
Welcome back to Transportation Weekly; I’m your host Kirsten Korosec, senior transportation reporter at TechCrunch. We love the reader feedback. Keep it coming.
Never heard of TechCrunch’s Transportation Weekly? Read the first edition here. As I’ve written before, consider this a soft launch. Follow me on Twitter @kirstenkorosec to ensure you see it each week. An email subscription is coming!
This week, we’re shoving as much transportation news, tidbits and insights in here as possible in hopes that it will satiate you through the end of the month. That’s right, TechCrunch’s mobility team is on vacation next week.
You can expect to learn about metamaterials, how traffic is creating genetic peril, the rise of scooter docks in a dockless world, new details on autonomous delivery startup Nuro and a look back at the first self-driving car fatality.
There are OEMs in the automotive world. And here, (wait for it) there are ONMs — original news manufacturers. (Cymbal clash!) This is where investigative reporting, enterprise pieces and analysis on transportation lives.
Mark Harris is here again with an insider look into autonomous vehicle delivery bot startup Nuro. The 3-year-old company recently announced that it raised $940 million in financing from the SoftBank Vision Fund.
Harris, during his typical gumshoeing, uncovers what Nuro might do with all that capital. It’s more than just “scaling up” and “hiring talent” — the go-to declarations from startups flush with venture funding. No, Nuro’s founders have some grand ideas, from automated kitchens and autonomous latte delivery to smaller robots that can cross lawns or climb stairs to drop off packages. Nuro recently told the National Highway Traffic Safety Administration that it wants to introduce up to 5,000 upgraded vehicles, called the R2X, over the next two years.
Let us explain. Most autonomous vehicles, robots and drones use lidar (light detection and ranging) to sense their surroundings. Lidar basically works by bouncing light off the environment and measuring how and when it returns; in short, lidar helps create a 3D map of the world. (Here’s a complete primer on WTF is Lidar).
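The distance math behind that bounce-and-measure description is simple time-of-flight. A minimal sketch:

```python
# Time-of-flight ranging, the principle behind lidar: distance is the speed
# of light times the pulse's round-trip time, halved (out and back).
SPEED_OF_LIGHT_M_S = 299_792_458

def tof_distance_m(round_trip_seconds: float) -> float:
    """Distance to a target from the round-trip time of a light pulse."""
    return SPEED_OF_LIGHT_M_S * round_trip_seconds / 2.0

# A return detected 1 microsecond after emission puts the target roughly
# 150 meters away; a real sensor repeats this measurement millions of times
# per second across many angles to build the 3D map.
```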
However, there are limitations to lidar sensors, which rely on mechanical platforms to move the laser emitter or mirror. That’s where metamaterials come in. In simple terms, metamaterials are specially engineered surfaces that have embedded microscopic structures and work as a single device. Metamaterials remove the mechanical piece of the problem, and allow lidar to scan when and where it wants within its field of view.
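The beam steering a metasurface performs can be pictured with the standard phase-gradient relationship from phased-array optics. The sketch below uses made-up numbers: the 905 nm laser wavelength and 500 nm element pitch are illustrative assumptions, not Lumotive’s published specs.

```python
import math

# Phase-gradient beam steering: a linear phase ramp across subwavelength
# elements tilts the outgoing wavefront, so the beam moves with no mirror
# or motor. Wavelength and pitch here are illustrative assumptions only.

def steering_angle_deg(wavelength_m: float, pitch_m: float,
                       phase_step_rad: float) -> float:
    """First-order steering angle for a given per-element phase step."""
    return math.degrees(
        math.asin(wavelength_m * phase_step_rad / (2 * math.pi * pitch_m)))

# A pi/4 phase step per 500 nm element at 905 nm steers the beam about
# 13 degrees; reprogramming the phase pattern re-aims the beam instantly.
angle = steering_angle_deg(905e-9, 500e-9, math.pi / 4)
```

That ability to re-aim electronically is what lets a metamaterial lidar “scan when and where it wants” within its field of view.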
Metamaterials deliver the whole package: they’re durable and compact, they solve problems with existing lidar systems, and they’re not prohibitively expensive.
If they’re so great why isn’t everyone using them? For one, it’s a new and emerging technology. Lumotive’s product is just a prototype. And Intellectual Ventures (IV) holds the patents for known techniques, Coldewey recently explained to me. IV is granting Lumotive an exclusive license to the tech — something it has done with other metamaterial-based startups it has spun out.
Shifting gears to Volvo
Automakers are rolling out increasingly robust advanced driver assistance systems in production cars. These new levels of automation are creating a conflict of sorts. On one hand, features like adaptive cruise control and lane steering can make commutes less stressful and arguably safer. And yet, they can also cause overconfidence in the system and complacency among drivers. (Even Tesla CEO Elon Musk has noted that complacency is a problem among owners using its advanced ADAS feature called Autopilot.) (And yes, I wrote advanced ADAS; it sounds repetitive, but it’s meant to express higher levels of automation, and it’s a term I recently encountered from two respected sources.)
Some argue that automakers shouldn’t deploy these kinds of automated features unless vehicles are equipped with driver-monitoring systems (DMS are essentially an in-car camera and accompanying software) that can ensure drivers are paying attention. Volvo is taking that a step further.
Driver Monitoring Camera in a Volvo research vehicle
The company announced this week that it will integrate DMS into its next-gen, SPA2-based vehicles beginning in the early 2020s and more importantly, enable its system to take action if the driver is distracted or intoxicated. The camera and other sensors will monitor the driver and will intervene if a clearly intoxicated or distracted driver does not respond to warning signals and is risking an accident involving serious injury or death. Under this scenario, Volvo could limit the car’s speed, call the Volvo on Call service on behalf of the driver or cause the vehicle to slow down and park itself on the roadside.
Volvo’s plans raise all kinds of questions, including privacy concerns and liability. The intent is to add a layer of safety. But it also adds complexity, which could compromise Volvo’s mission. The Autonocast, a podcast I co-host with Alex Roy and Ed Niedermeyer, talks about Volvo’s plans in our latest episode. Check it out.
A little bird …
We hear a lot. But we’re not selfish. Let’s share.
Remember two weeks ago when we dug into Waymo’s laser bears and wondered whether we had reached “peak” lidar? (Last year, there were 28 VC deals in lidar technology valued at $650 million. The number of deals was slightly lower than in 2017, but the values jumped by nearly 34 percent.)
It doesn’t look like we have. We’re hearing about several funding deals in the works or recently closed, a revelation that shows investors still see opportunity in startups trying to bring the next generation of light detection and ranging sensors to market.
Spotted …. Former Zoox CEO and co-founder Tim Kentley-Klay was spotted at the Self-Racing Car event at Thunderhill Raceway near Willows, Calif., this weekend.
Got a tip or overheard something in the world of transportation? Email me or send a direct message to @kirstenkorosec.
Deal of the week
Lyft set the terms for its highly anticipated initial public offering and announced it will kick off the roadshow for its IPO. That means the initial public offering will likely occur in the next two weeks. Here’s the S-1 that Lyft filed in early March. This latest announcement also revealed new details, including that its ticker symbol will be “LYFT” — as one might expect — and that the IPO range is set at between $62 and $68 per share for the sale of 30,770,000 shares of Class A common stock. Lyft could raise up to $2.1 billion at the higher end of that range, or $1.9 billion at the lower end.
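For those checking the math, the share count and price range do pencil out to those totals:

```python
# Quick sanity check of Lyft's IPO figures: 30,770,000 Class A shares at the
# $62-$68 range implies a raise of roughly $1.9 billion to $2.1 billion.
SHARES = 30_770_000
LOW_PRICE, HIGH_PRICE = 62, 68

raise_low = SHARES * LOW_PRICE    # $1,907,740,000
raise_high = SHARES * HIGH_PRICE  # $2,092,360,000
```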
The Lyft news was big — and it’s a story we’ll be following for a while. However, we wanted to highlight another one of Ingrid Lunden’s articles because it underscores a point I’ve been pushing for a while: not every important move in the world of autonomous vehicles occurs in the big three of Detroit, Pittsburgh and Silicon Valley.
This week, Yandex, the Russian search giant that has been working on self-driving car technology, inked a partnership with Hyundai to develop software and hardware for autonomous car systems. This is Yandex’s first partnership with an OEM. But it’s not Hyundai’s first collaboration with an autonomous vehicle startup. (Hyundai has a partnership with Aurora, too.)
Yandex will work with Hyundai Mobis, the car giant’s OEM parts and service division, “to create a self-driving platform that can be used by any car manufacturer or taxi fleet,” covering both a prototype and components for other carmakers.
One year ago, I parked on a small rise overlooking Mill Avenue in Tempe, Arizona. The mostly dirt knoll, dotted with some trees and a handful of structures known out here as ramadas, was hardly remarkable. Just one other car sat in the disintegrating asphalt parking lot, the result of so many sun-baked days. A group of homeless people had set up at the picnic tables under a few of the structures, their dogs lolling nearby.
And yet, it was here, or specifically on the gleaming road below, that something extraordinary had indeed happened. Just days before, Elaine Herzberg was crossing Mill Avenue south of Curry Road when an Uber self-driving vehicle struck and killed her. The vehicle was in autonomous mode at the time of the collision, with a human test driver behind the wheel.
I had been in the Phoenix area, a hub for testing autonomous vehicle technology, to moderate a panel on that very subject. But the panel had been hastily canceled by organizers worried about the optics of such a discussion. And so I picked up Starsky Robotics CEO Stefan Seltz-Axmacher, who was also in town for the now-canceled panel, and we drove to the site where Herzberg had died.
I wrote at the time that “March 18 changed everything—and nothing—in the frenzied and nascent world of autonomous vehicles.” One year later, those words are still correct. The incident dumped a bucket of ice water over the figurative heads of autonomous vehicle developers. Everyone, it seemed, had sobered up. Testing was paused, and dozens of companies assessed their own safety protocols. Earnest blogs were written. Lawsuits were filed.
And yet, the cogs of the AV machine haven’t stopped turning. That’s not necessarily a bad thing. Innovation can sometimes “make the world a better place.” But it’s rarely delivered in a neat little package, no strings attached.
I’m hardly the first to reflect or write about this one-year anniversary. There are many takes, some of them hot, others not so much. And there are a few insightful ones; Autonocast co-host Niedermeyer has one entitled 10 Lessons from Uber’s Fatal Self-Driving Car Crash that’s worth reading.
Right now, I’m more interested in those lessons that haven’t been learned yet. It’s partly what prompted us to launch this newsletter, a weekly post that aims to be more than a historical record or a medium to evangelize AV technology.
Tiny but mighty micromobility
It’s been said before, but we’ll say it again. Data is queen. This past week, mobility management startup Passport partnered with Lime and the cities of Charlotte, N.C., Detroit and Omaha, Neb., to create a framework to apply parking principles, data analysis and more to the plethora of shared micromobility services.
And, in case you missed it, Bird had to let some people go late last week. We’ve learned a few more details since the news broke. That came out to about 40 people out of the ~900-person company. The layoffs were part of Bird’s annual performance review process and only affected U.S.-based employees, TechCrunch learned. Those laid off are eligible for severance, including health and medical benefits. Despite the layoffs, Bird is actively looking to hire for more than 100 positions throughout the company.
Traffic affects more than people. Take a look at the map pictured above. See the red line? That’s Interstate 15 in southern California. To the east are inland communities and eventually the San Bernardino National Forest and San Jacinto Mountains.
To the west are the Santa Ana Mountains and an increasingly isolated family of 20 cougars, the Los Angeles Times reports this week. The 15 and the heavy traffic on it are putting pressure on the gene pool. In the past 15 years, at least seven cougars have crossed the 15. Just one sired 11 kittens. This lack of genetic diversity — the lowest documented for the species outside of the endangered Florida panther — could have devastating effects on the mountain lions here. A study published in the journal Ecological Applications predicts extinction probabilities of 16 percent to 28 percent over the next 50 years for these lions.
In this specific case, the last natural wildlife corridor in the area — and perhaps the difference between survival and extinction — is little Temecula Creek.
This phenomenon is happening in other areas as well, causing communities to toy with possible solutions. One option: shuttling the lions over to the other side, a move that could cause all sorts of problems. In other places, such as an area near the Santa Monica Mountains, a wildlife overpass has been proposed.
30 percent of mass transit providers collect fares through a mobile app; only 39 percent have an app at all
26 percent of transit operators say costs are their biggest challenge. Among metro mass transit agencies, that concern jumps to 40 percent
Nearly a quarter (23 percent) of national operators and 24 percent of large transit agencies (1,000 to 10,000 employees) say that implementing mobile technology is their single biggest challenge.
Customer acquisition is the second-most common challenge in US transportation, cited by 23 percent of national, 33 percent of regional and 17 percent of private operators.
Other items of note:
The Information’s Amir Efrati has yet another piece on Alphabet’s self-driving car business Waymo. This time Efrati analyzed confidential Waymo customer feedback on 2,500 rides this quarter. The upshot: autonomous ride-hailing services face considerable headwinds in their attempt to replicate Uber and Lyft.
Dockless scooters have been all the rage; now it seems that cities and scooter startups are considering whether free-floating micromobility might need to be reined in a skosh.
Lyft, which has scooters in 13 cities, recently experimented with parking racks. These parking racks or docks are designed specifically for scooters. The company set up these docking stations in Austin during SXSW and released a handy Guide to Good Scootiquette to encourage better and safer rider behavior.
Meanwhile, an industry around scooter management is emerging. Swiftmile, a startup that developed light electric vehicle charging systems for bike share, has new solar-powered charging platforms for scooters. TechCrunch met Swiftmile CEO Colin Roche in Austin earlier this month and learned that a number of cities are interested in deploying these systems. Swiftmile’s system not only charges the scooters, it also can provide scooter companies with diagnostics and keep the device locked in the dock if it’s malfunctioning. The docks can be programmed to lock the scooters up during certain hours — bar closing time would seem like an optimal time — to keep them from being misused. Systems like these could help scooter companies like Bird and Lime extend the life of their scooters and keep local officials happy.
Autonomous street sweepers
ENWAY and Nanyang Technological University are deploying autonomous street sweepers in the inner city of Singapore as part of a project with Singapore’s National Environment Agency. The project began this month and will run until September 2020.
Under the pilot, ENWAY’s autonomous sweeper will clean an area of more than 12 kilometers of roads every day. The sweeper is equipped with numerous sensors, including 2D and 3D lidars, 3D cameras and GNSS. The base vehicle is a retrofitted all-electric compact road sweeper from Swiss manufacturer Bucher Municipal.
The company aims to commercialize autonomous cleaning on public ground in Singapore and abroad.
A demo of the sweeper is in the video below.
Silvercar scales up
On the other end of the transportation spectrum, Silvercar by Audi has rolled out a delivery and pickup service in downtown locations in New York and San Francisco. Silvercar customers can request their rental be dropped off and picked up at home or a location of their choosing for an additional fee. Silvercar also announced plans to bring its premium rental experience to Boston at Logan International Airport on April 15.
If you’ve never heard of Silvercar, you’re forgiven. It’s not exactly widespread. The company aims to remove the headache of traditional car rental. I recently tried it out in Austin during SXSW and found that it is convenient, and works pretty well, but doesn’t remove some of the annoying pinch points of car rentals. Yes, there are no lines. When I got off the plane in Austin, I received a message that my car was ready and to hail my driver who picked me up curbside, drove me to the Silvercar operation, and brought me to my Audi. I used the app to unlock the vehicle.
That’s cool. What would be even better is skipping all those steps and being able to access the vehicle right there in the airport without interacting with anyone. (Granted, not everyone wants that.) This new delivery and pickup service in New York and San Francisco gets closer to that sweet spot.
On our radar
New York Auto Show is coming up and I’ll be in the city right before the show. But then it’s back to the west coast for TC Sessions: Robotics + AI, a one-day event held April 18 at UC Berkeley. I’ll be interviewing Anthony Levandowski on stage and moderating a panel with Aurora co-founder Sterling Anderson and Uber ATG Toronto chief Raquel Urtasun to talk about building the self-driving stack and how AI is used to help vehicles understand and predict what’s happening in the world around them and make the right decisions.
Also, the PAVE Coalition is hosting its first public demonstration event April 5-7 at the Cobo Center in downtown Detroit. The public will have an opportunity to ride in a self-driving car, and interactive displays will help visitors understand the technology behind self-driving cars and their potential benefits.
Finally, one electric vehicle thing we’ve been following. Columbus, Ohio won the U.S. Department of Transportation’s first-ever Smart City Challenge and we’ve been tracking the city’s progress and its efforts to increase electric vehicle adoption.
One of the organizers told TechCrunch that since the beginning of 2017, the cumulative new EV registrations in the Columbus metropolitan area have increased by 121 percent. New EV registrations over this period outpaced the 82 percent expansion in the Midwest region and the 94 percent growth seen across the U.S. over the same time period.
Thanks for reading. There might be content you like or something you hate. Feel free to reach out to me at email@example.com to share those thoughts, opinions or tips.
Some of us Earthlings may see dancing, green lights in the sky on Saturday night.
The sun blasted out a flare of energized particles into space on March 20, and the National Oceanic and Atmospheric Administration’s (NOAA) Space Weather Prediction Center forecasts that a strip of the northern U.S. may experience a visible effect of this event: an aurora, or eerie dancing greenish light, created when the sun’s particles interact with Earth’s atmosphere.
Such an atmospheric event is stoked by a disturbance called a geomagnetic storm, where energized solar particles propel changes in Earth’s magnetosphere — a sprawling zone of space around Earth where the planet’s magnetic field changes and evolves in reaction to the sun.
It often takes a few days for powerful eruptions from the sun, known as coronal mass ejections (CMEs), to hit Earth and stoke a space storm.
The Space Weather Prediction Center predicts that a curved strip of land in the U.S. between Washington and Maine is the “most likely” extent of the celestial lights, though areas as far south as Colorado may be treated to the aurora.
This farthest extent is forecast to fall between the green and yellow lines seen in the above NOAA graphic, or the tweet below. This means portions of Washington, Idaho, Montana, North Dakota, South Dakota, Minnesota, Iowa, Michigan, Wisconsin, Illinois, New York, Vermont, New Hampshire, and Maine.
A G2 (Moderate) geomagnetic storm watch is in effect for the 23 March, 2019 UTC-day due to anticipated CME arrival. The CME was associated with a C4 flare on 20 March, 2019 at 1118 UTC (0718 EDT). Continue to monitor our SWPC webpage for additional updates. pic.twitter.com/tjZIGFiLSz