Trains, Planes, and Automobiles:
The Necessity of Real-Time Processing in Autonomous Vehicles

When thinking about autonomous vehicles, it’s easy to focus on driverless cars: how your commute to work will change, or how family road trips will be filled with board games instead of “are we there yet?” The technology has been parodied on the HBO show Silicon Valley, is roaming the streets of San Francisco and Hamburg, and was at the center of a vehicular accident that led to a man’s decapitation.

Whether you are trepidatious or excited, self-driving technology is inevitable. For the folks working to get it ready for widespread deployment, the challenge is making sense of the road in real time, not whether the driver’s seat can rotate to face backwards.

Ultimately, the concept of a driverless vehicle is simple: a vehicle that drives itself. The mechanics of how that vehicle drives itself are where the challenge lies. To work without human intervention, driverless vehicles are “permanently surveying the road and sending relevant bits of information up to the cloud,” explains Christoph Grote, SVP of Electronics at BMW. The trick is identifying and capturing the “relevant” information, and this is not easy.

AEye, an artificial perception technology company, focuses specifically on creating technology that can abstract and extract meaningful, necessary information from a car’s environment. Namely, it is building AI-enabled sensors that let vehicles “think like a robot, perceive like a human”: seeing, classifying, and responding to an object, whether a parked car or a child crossing the street, in real time and before it’s too late.
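To make that see-classify-respond idea concrete, here is a minimal, purely illustrative sketch of such a loop. None of the names below come from AEye’s actual software; the detection record, classifier, and response policy are hypothetical stand-ins for the kind of pipeline the paragraph describes.

```python
import time
from dataclasses import dataclass

# Hypothetical detection record; a real perception stack carries far richer
# state (velocity, uncertainty, sensor modality, track IDs, and so on).
@dataclass
class Detection:
    label: str         # e.g. "parked_car", "pedestrian"
    distance_m: float  # estimated range to the object
    moving: bool       # is the object in motion?

def classify(raw_frame):
    """Stand-in for a learned classifier over raw sensor data."""
    # A real system would run a neural network here; we fake one detection.
    return [Detection(label="pedestrian", distance_m=12.0, moving=True)]

def respond(detection):
    """Toy policy: a moving object nearby demands an immediate response."""
    if detection.moving and detection.distance_m < 20.0:
        return "brake"
    return "proceed"

def perception_loop(sensor_frames):
    """Sense -> classify -> respond, once per incoming frame."""
    for frame in sensor_frames:
        start = time.perf_counter()
        for detection in classify(frame):
            action = respond(detection)
            print(f"{detection.label} at {detection.distance_m:.0f} m -> {action}")
        # The real-time constraint: the whole loop must finish
        # before the next frame arrives.
        elapsed_ms = (time.perf_counter() - start) * 1000
        print(f"frame processed in {elapsed_ms:.2f} ms")

perception_loop(sensor_frames=[object()])  # one dummy frame
```

The point of the sketch is the deadline in the last two lines: classification and response are only useful if they complete before the world changes again.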

What AEye understood is that to create technology that can perceive like a human and think like a robot, they needed a rocket scientist, (almost) literally. That role falls to AEye’s Chief Scientist, Dr. Allan Steinhardt. Dr. Steinhardt is an expert on radar and missile defense, including ground moving target indicator radars (so, a radar and missile scientist, if you want to get nit-picky about the rocket analogy) and space surveillance, and he is the former Chief Scientist for DARPA.

At this year’s BootstrapLabs Applied Artificial Intelligence Conference, Dr. Steinhardt will deliver a keynote on how transportation AI differs from other applications of AI. In addition to exploring how AI is augmenting the collection and interpretation of the environmental data a car must process, he will also address the very environments these autonomous vehicles will be driving in: the cities of the future.

It is easy to get fixated on the gadget. We are primed to want the newest artifact of technology, be it a phone, a rideshare scooter, or a smart speaker. What is easily forgotten is the network these technologies rely on. Until now, most devices needing internet connectivity have been able to get by on Wi-Fi or 4G. Autonomous vehicles, and the reams of data they must process and distribute, require far more bandwidth. Christoph Grote explains that driverless cars are continuously gathering data from their environments, “generating a true, real-time map of a car that is pushed down to all the other cars…autonomous driving is not the capability of a car, it is really about swarm intelligence.” And to do this, autonomous vehicles will need to connect over 5G.
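As a purely illustrative sketch of the “swarm intelligence” idea, the toy code below has each car publish its local observations to a shared map and merge updates pushed back down from the swarm. Every class and method name here is hypothetical; none of it is drawn from BMW’s or anyone else’s actual system.

```python
# Hypothetical shared road map: each grid cell of the road keeps the
# freshest observation reported by any car in the swarm.
class SwarmMap:
    def __init__(self):
        self.cells = {}  # (x, y) grid cell -> (timestamp, observation)

    def publish(self, cell, timestamp, observation):
        """A car uploads what it sees; newer reports win."""
        current = self.cells.get(cell)
        if current is None or timestamp > current[0]:
            self.cells[cell] = (timestamp, observation)

    def snapshot(self):
        """What gets 'pushed down to all the other cars'."""
        return dict(self.cells)

class Car:
    def __init__(self, name):
        self.name = name
        self.local_map = {}

    def observe_and_share(self, swarm, cell, timestamp, observation):
        # Record the observation locally, then share it with the swarm.
        self.local_map[cell] = (timestamp, observation)
        swarm.publish(cell, timestamp, observation)

    def receive(self, swarm):
        # Merge the swarm's view into the local map.
        self.local_map.update(swarm.snapshot())

swarm = SwarmMap()
a, b = Car("A"), Car("B")
a.observe_and_share(swarm, cell=(3, 7), timestamp=100, observation="stalled_truck")
b.receive(swarm)
print(b.local_map)  # car B now knows about the hazard car A saw
```

In this toy version the exchange is a couple of dictionary updates; at road scale, with every car continuously publishing and receiving, it becomes the bandwidth problem that motivates 5G.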

The cities of the future will be increasingly optimized to support automated functionality, including autonomous driving. But how, you ask? Join Dr. Allan Steinhardt’s keynote at AAI19 to find out.