WHAT ARE THE MAIN CHALLENGES TO MOVING BEYOND LEVEL 3 AUTOMATION?
To move past Level 3 autonomy, where the system is in charge and the automaker accepts liability, automakers need certainty: certainty that cars without human drivers will quickly and accurately interpret their environment and respond to danger.
This requires fast, accurate perception, and it requires solving the toughest corner cases: a plastic bag floating across the freeway, a car swerving, or a child chasing a ball into the street. Level 3+ demands that cars perceive as well as or better than a human, a tall order that will require automakers to move beyond cameras and radar to include LiDAR in the mix. While cameras and radar are sufficient for Level 2 ADAS, LiDAR, which provides higher resolution, better performance in diverse weather conditions, and more accurate measurement, will be a critical component of mobility deployments. In addition, all of these sensors need to be made more intelligent and actively integrated so that they can trigger and reinforce one another.
Beyond perception, other challenges include business models, infrastructure, and legal ramifications. Bringing Level 4 cars to market requires a whole new approach to pricing: all of the new components, whether hardware or software, must be paid for by someone, on top of the usual parts. In addition, there will be an extended period during which AVs and human drivers share the road. How does the infrastructure need to evolve to support both safely? What additional investments should we be making in our transportation infrastructure now to ensure we realize the promise of increased safety and efficiency? Lastly, from a legal perspective, automotive players must navigate liability-related uncertainty to determine who is at fault when something goes wrong.
HOW DOES AEYE TACKLE THIS CHALLENGE HEAD ON?
Creating a perception system that acquires better-quality data, faster, requires a disruptive way of thinking. Standard LiDAR is a passive sensor that lacks intelligence, gathering information indiscriminately about its environment regardless of evolving conditions or competing priorities. iDAR (Intelligent Detection and Ranging), AEye’s breakthrough perception innovation, instead enables self-driving cars to intelligently assess hazards and respond to changing conditions faster, with greater accuracy and reliability. iDAR fuses solid-state agile LiDAR with a low-light HD camera, then integrates artificial intelligence, to create a smart sensor that is fully software definable.
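To make the fusion idea concrete, here is a minimal sketch of camera-cued LiDAR interrogation, the kind of sensor cross-triggering the text describes. Every function name, field, and value below is an illustrative assumption, not AEye’s actual API: the camera stand-in emits a 2D detection, and the LiDAR stand-in is steered to a narrow region of interest around that cue rather than raster-scanning the whole frame.

```python
# Toy sketch of camera-cued LiDAR interrogation. All names and numbers
# are hypothetical placeholders, not a real sensor interface.

def camera_detections(frame):
    """Stand-in for an HD-camera object detector: returns 2D detections
    with a bearing (angle) toward each object."""
    return [{"label": "pedestrian", "bearing_deg": 12.5}]

def lidar_interrogate(bearing_deg, fov_deg=2.0):
    """Stand-in for steering agile LiDAR shots at a narrow field of view
    around the camera cue, returning a range measurement."""
    return {"bearing_deg": bearing_deg, "fov_deg": fov_deg, "range_m": 47.3}

def fuse(frame):
    """Camera cue -> targeted LiDAR measurement -> combined 3D track."""
    tracks = []
    for det in camera_detections(frame):
        depth = lidar_interrogate(det["bearing_deg"])
        tracks.append({**det, "range_m": depth["range_m"]})
    return tracks
```

The design point is the direction of control: the camera’s cheap, dense 2D information triggers expensive, sparse LiDAR shots exactly where they matter, so each sensor reinforces the other instead of operating independently.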
Unlike standard LiDAR, AEye’s agile iDAR is situationally adaptive (at either design time or run time): it can modify scan patterns and trade off resources such as power, update rate, resolution, and range. This lets iDAR dynamically allocate sensor resources to optimally search a scene, efficiently identify and acquire critical objects, such as a child walking into the street or a car entering an intersection, and determine the appropriate course of action. Doing this in real time is the difference between a safe journey and an avoidable tragedy.
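The resource trade the paragraph above describes can be sketched in a few lines. This is a hypothetical software-definable scan controller, with made-up names and numbers chosen only to show the shape of the trade-off: when a potential hazard appears, it gives up range and spends that budget on a faster revisit rate and denser angular resolution over the region of interest.

```python
# Hypothetical sketch of a software-definable scan configuration.
# Field names and trade-off factors are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class ScanConfig:
    update_rate_hz: float   # how often the region is revisited
    resolution_deg: float   # angular spacing between scan points
    range_m: float          # effective detection range

# Baseline wide-area search pattern.
BASELINE = ScanConfig(update_rate_hz=10.0, resolution_deg=0.2, range_m=200.0)

def adapt_scan(config: ScanConfig, threat_detected: bool) -> ScanConfig:
    """Trade range for revisit rate and resolution on a region of
    interest when a potential hazard (e.g. a pedestrian) is cued."""
    if not threat_detected:
        return config
    return ScanConfig(
        update_rate_hz=config.update_rate_hz * 4,   # revisit the ROI faster
        resolution_deg=config.resolution_deg / 4,   # denser point spacing
        range_m=config.range_m / 2,                 # accept shorter range in trade
    )

roi = adapt_scan(BASELINE, threat_detected=True)
```

Because the configuration is just data, the same hardware can be re-tasked at run time, which is what "fully software definable" implies in practice.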
We’ve done extensive research and field studies to determine the most important corner cases the industry must solve to get beyond Level 3 autonomy. Of the top 50 use cases we identified, LiDAR is required to address more than half, while iDAR is essential in at least 16 scenarios. We firmly believe that safe, reliable Level 3+ autonomy will require intelligent sensing. As for the legal and business-model challenges: more intelligent systems will be better able to resolve liability-related uncertainty, while, from a business-model perspective, AEye is disrupting the market with high performance, automotive quality, and costs that are unprecedented for the capabilities offered, and that will only come down further with volume.