Sensors used in autonomous driving are affected to varying degrees by adverse weather. The reliability of each sensor changes with lighting, precipitation type and intensity, and visibility. Cameras and LiDARs, the most common sensing modalities in autonomous driving, provide high-resolution signals but degrade in poor weather; camera image contrast is also sensitive to changes in lighting. RADARs, on the other hand, offer long range but comparatively low spatial resolution, yet they remain resilient to varying lighting and weather conditions. A compact weather detection system can therefore be used to dynamically steer other algorithms in an autonomous vehicle (e.g., vehicle control, object detection and tracking) toward the most reliable sensors. By adjusting per-sensor weights, a fusion scheme can shift focus to the currently best-performing sensor combination. Alternatively, a weather detection system could be used to switch between weather-specific models or ensemble algorithms. Multi-model approaches to autonomous driving have been gaining popularity recently, since a model trained in sunny conditions will likely underperform in non-ideal weather. This paper presents a compact Convolutional Neural Network (CNN) that detects the current driving weather conditions from a narrow strip of the grayscale forward-facing camera image. Performance is assessed with standard multi-class classification metrics; the weighted average of the per-class F1-scores was 94%. The model was trained and tested on the RADIATE dataset, which contains multimodal sensor data of driving in different weather conditions, including sunny, snow, overcast, fog, and rain.
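For reference, the weighted-average F1-score reported above is the standard multi-class metric in which each class's F1 is weighted by its support. The sketch below is a generic illustration of that metric (equivalent in behavior to scikit-learn's `f1_score(..., average='weighted')`), not code from the paper; the toy labels are hypothetical and merely reuse the RADIATE weather class names.

```python
from collections import Counter

def weighted_f1(y_true, y_pred):
    """Weighted-average F1: per-class F1 scores averaged with
    weights proportional to each class's support in y_true."""
    classes = sorted(set(y_true))
    support = Counter(y_true)
    total = len(y_true)
    score = 0.0
    for c in classes:
        # Per-class true positives, false positives, false negatives
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        precision = tp / (tp + fp) if (tp + fp) else 0.0
        recall = tp / (tp + fn) if (tp + fn) else 0.0
        f1 = (2 * precision * recall / (precision + recall)
              if (precision + recall) else 0.0)
        score += (support[c] / total) * f1
    return score

# Toy example using the five RADIATE weather classes (hypothetical labels)
y_true = ["sunny", "sunny", "snow", "fog", "rain", "overcast"]
y_pred = ["sunny", "snow", "snow", "fog", "rain", "overcast"]
print(round(weighted_f1(y_true, y_pred), 3))  # → 0.833
```

Weighting by support keeps frequent classes (e.g., sunny) from being drowned out by rare ones, which matters when weather classes are imbalanced in the training data.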