
The State of Self-Driving – How to Solve Perception Through Intelligent Sensing?

Interview with Peter Szelei, AEye Director, Business Development, EMEA

WHAT ARE THE MAIN CHALLENGES TO MOVING BEYOND LEVEL 3 AUTOMATION?

To move past Level 3 autonomy, where the system is in charge and the automaker accepts liability, automakers need certainty: certainty that cars without human drivers will quickly and accurately interpret their environment and respond to danger.

This requires fast, accurate perception, and it requires solving the toughest corner cases: things like a plastic bag floating across the freeway, a car swerving, or a child chasing a ball into the street. Level 3+ demands that cars perceive as well as or better than a human, a tall order that will require automakers to move beyond camera and radar to include LiDAR in the mix. While cameras and radar are sufficient for Level 2 ADAS, LiDAR, which provides higher resolution, better performance in diverse weather conditions, and more accurate measurement, will be a critical component of mobility deployments. In addition, all of these sensors need to be made more intelligent and actively integrated so that they can trigger and reinforce each other.

Beyond perception, other challenges include business models, infrastructure, and legal ramifications. Bringing L4 cars to market requires a whole new approach to pricing solutions: all of the new components, be they hardware or software, need to be paid for by someone, on top of the usual parts. In addition, there will be an extended period of time during which AVs and human drivers share the road. How does the infrastructure need to evolve to support both safely? What additional investments should we be making in our transportation infrastructure now to ensure we realize the promise of increased safety and efficiency? Lastly, from a legal perspective, automotive players must navigate liability-related uncertainty to determine who is at fault.

HOW DOES AEYE TACKLE THIS CHALLENGE HEAD ON?

Creating a perception system that acquires better-quality data, faster, requires a disruptive way of thinking. Standard LiDAR is a passive sensor system that lacks intelligence: it gathers information indiscriminately about its environment, regardless of evolving conditions or competing priorities. iDAR (Intelligent Detection and Ranging) is AEye's breakthrough perception innovation, which enables self-driving cars to intelligently assess hazards and respond to changing conditions faster, with greater accuracy and reliability. iDAR fuses solid-state agile LiDAR with a low-light HD camera, then integrates artificial intelligence, to create a smart sensor that is fully software definable.

Unlike standard LiDAR, AEye's agile iDAR is situationally adaptive (at either design time or run time), so it can modify scan patterns and trade resources such as power, update rate, resolution, and range. This enables iDAR to dynamically allocate sensor resources to optimally search a scene, efficiently identify and acquire critical objects, such as a child walking into the street or a car entering an intersection, and determine the appropriate course of action. Doing this in real time is the difference between a safe journey and an avoidable tragedy.

We’ve done extensive research and field studies to determine the most important corner cases the industry must solve to get beyond Level 3 autonomy, and we’ve found that, of the top 50 use cases, LiDAR is required to address more than half of them, while iDAR is essential in at least 16 scenarios. We firmly believe that safe, reliable Level 3+ autonomy will require intelligent sensing. As for the legal and business model challenges: more intelligent systems will be better able to resolve liability-related uncertainty, while, from a business model perspective, AEye is disrupting the market with high performance and automotive-grade quality at costs that are unprecedented for the capabilities offered, and which will only come down further with volume.

IN JUST 2 SENTENCES...

...WHAT CAN OUR AUDIENCE EXPECT TO LEARN FROM YOUR SESSION, THE STATE OF SELF-DRIVING – HOW TO SOLVE PERCEPTION THROUGH INTELLIGENT SENSING?

The audience will have a chance to learn how AEye’s iDAR goes beyond LiDAR to enable intelligent sensing, which plays a critical role in helping cars understand context, minimize false positives, and reduce latency, speeding detection and classification. They will also hear about the toughest corner cases (complex, dangerous, and accident scenarios) for perception to solve, and learn how both LiDAR and iDAR respond to these scenarios.

WHAT WILL THE NUMBER 1 TAKEAWAY FOR THE AUDIENCE BE?

Putting smarts at the sensors enables vehicles to get to reliable perception faster.

WHAT ARE YOU MOST LOOKING FORWARD TO AT TU-AUTOMOTIVE EUROPE?

It is always nice to have an opportunity to catch up with colleagues across the industry about the latest trends and their plans, and to hear their perspectives on the latest challenges that we need to overcome as an industry.

TU-Automotive is a great show for hearing key players share their views, and I am looking forward to getting feedback on the latest product we will be demonstrating on site.