dc.description.abstract | Background-foreground separation (BFS) is a critical computer vision task aimed at distinguishing dynamic foreground objects from static backgrounds. While consumer cameras are widely used because of their affordability, high resolution, and ease of use, they are prone to failure under varying lighting conditions and in the presence of reflective surfaces and occlusions. This paper investigates the use of a cost-effective radar system to enhance the Robust PCA technique for BFS, addressing these common failure modes. By applying algorithm unrolling, we achieve real-time computation, feedforward inference, and strong generalization, outperforming traditional deep learning approaches. Using the RaDICaL dataset, we show that integrating radar data significantly improves both quantitative performance and qualitative robustness compared to image-only methods, demonstrating enhanced resilience to the typical failure modes encountered with cameras. | en_US |