While many seem to take it for granted that autonomous vehicles are the future of transportation, projections for when that future will arrive keep getting pushed out. Just within the past year, Ford CEO Jim Hackett said the industry “overestimated the arrival of autonomous vehicles.” And Raquel Urtasun, the chief scientist in charge of Uber’s self-driving efforts, told Reuters: “The first thing I learned is no timelines, right?”
Such skepticism is no surprise to Nate Ramanathan, Vice President of Operations at AEye. AEye is a pioneer in artificial perception for self-driving cars, best known for its iDAR (Intelligent Detection and Ranging) technology.
When starting out, the company was focused on two different markets: the robot taxi market and the advanced driver-assistance systems (ADAS) market. Today, the latter is taking the front seat. “It turns out that the next step for autonomous vehicles is not Level 5 (full) autonomy,” Nate said, “but Level 2.5, Level 3 autonomy. And we’re going to have to ease people into it, because all of a sudden letting the car drive itself, for most people, is kind of a freaky moment.”
“When Luis Dussan started our company many of us were optimistic about focusing on full autonomy right off the bat. But he looked at it differently. ‘This is going to come in stages,’ he told us. So our architecture needs to support that.”
– Nate Ramanathan, Vice President of Operations, AEye
So what does this mean for companies on the cutting edge? First, a company has to go where the market is. In the world of autonomous vehicles especially, those wishing to compete must know which problems and use cases have already been solved, and focus their work on the ones that remain.
“We have to deal with corner cases, like the kid running out in the middle of the block or the person coming out from behind a car carrying a trash bag,” Nate said.
Second, a company has to build an architecture that anticipates the changes to come. AEye is doing this, in part, by building an architecture that leverages edge computing to save both power and time. It is also anticipating the evolution of sensor systems that will provide iDAR with critical data.
Accepting that you can’t put a timeline on the ultimate goal (in this case, full autonomy) forces you to think in the near term about what is needed today, while building for the future and what will be needed next.