Brains vs. Brawn: The Quest for Artificial Perception
James Robnett, VP of Automotive Business Development, AEye
As the ADAS and robotaxi markets advance and converge, OEMs and mobility players will compete on trust and safety. How can self-driving cars learn to perceive their environment quickly and reliably, when simply pushing more compute power through the system isn’t working?
AI-based sensors are emerging that capture and process smart data at the edge, enabling faster, more accurate perception.
Current metrics don’t adequately measure the capabilities of these AI-based LiDAR systems, nor do they explicitly address real-world problems facing autonomous driving, such as hazard detection and tracking.
New metrics are needed that not only better serve AV development, but also drive more robust safety standards across the industry.