It is very difficult for visually impaired people (VIP) to perceive and avoid obstacles at a distance. To address this problem, we propose a sensor fusion system, which combines an RGB-depth (RGB-D) sensor and a millimeter wave (MMW) radar sensor, to perceive the surrounding obstacles. The position and velocity of multiple targets are detected by the MMW radar based on the principle of frequency modulated continuous wave. The depth and position of the obstacles are verified by the RGB-D sensor based on the MeanShift algorithm. Data fusion based on the joint probabilistic data association algorithm and the Kalman filter enables the navigation assistance system to obtain more accurate state estimates than either sensor alone. A nonsemantic stereophonic interface transfers the obstacle detection results to the VIP. The experimental results show that multiple objects at different ranges and angles are detected by the radar and the RGB-D sensor. The effective detection range is expanded to 80 m compared with using only the RGB-D sensor. Moreover, the measurement results are stable under diverse illumination conditions. As a wearable system, the sensor fusion system offers versatility, portability, and cost-effectiveness.
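The Kalman-filter fusion step described above can be illustrated with a minimal sketch. This is not the paper's implementation: it omits the joint probabilistic data association stage (a single obstacle is assumed, so no association is needed) and shows only a 1-D constant-velocity Kalman filter that sequentially updates with a radar range measurement and an RGB-D depth measurement. All noise parameters and measurement values are illustrative assumptions.

```python
import numpy as np

# 1-D constant-velocity Kalman filter fusing a radar range measurement
# with an RGB-D depth measurement of the same obstacle.
# All numeric values below are assumed for illustration.
dt = 0.1                            # update interval [s] (assumed)
F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition: [range, range-rate]
H = np.array([[1.0, 0.0]])              # both sensors observe range only
Q = np.diag([0.01, 0.01])               # process noise (assumed)
R_radar = np.array([[0.25]])            # radar range variance (assumed)
R_rgbd = np.array([[0.04]])             # RGB-D depth variance (assumed)

x = np.array([[10.0], [0.0]])           # initial state: 10 m, stationary
P = np.eye(2)                           # initial covariance

def predict(x, P):
    x = F @ x
    P = F @ P @ F.T + Q
    return x, P

def update(x, P, z, R):
    y = z - H @ x                       # innovation
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P

# One fusion cycle: predict, then update sequentially with each sensor.
x, P = predict(x, P)
x, P = update(x, P, np.array([[9.8]]), R_radar)   # radar measurement
x, P = update(x, P, np.array([[9.9]]), R_rgbd)    # RGB-D measurement
print(float(x[0, 0]), float(P[0, 0]))
```

Because the RGB-D variance is smaller than the radar variance here, the fused estimate is pulled closer to the RGB-D measurement, and the posterior range variance drops well below either single-sensor variance, which is the effect the abstract reports.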
According to data from the World Health Organization, 285 million people are estimated to be visually impaired worldwide, and 39 million are blind. It is very difficult for visually impaired people to perceive and avoid obstacles at a distance while travelling. To address this problem, we propose a sensor fusion system, which combines an RGB-Depth sensor and a millimeter wave radar sensor, to detect surrounding obstacles. The range and velocity of multiple obstacles are acquired by the millimeter wave radar based on the principle of frequency modulated continuous wave. The positions of the obstacles are verified by the RGB-Depth sensor based on contour extraction and the MeanShift algorithm. The data fusion algorithm, based on particle filters, obtains accurate state estimation by fusing RGB-Depth data with millimeter wave radar data. The experimental results show that multiple obstacles at different ranges and angles are successfully detected by the proposed system. The data fusion reduces the measurement uncertainties, while the effective detection range is expanded compared with using the RGB-Depth sensor alone. Moreover, the measurement results are stable when the illumination varies. As a wearable prototype, the sensor fusion system offers versatility, portability, and cost-effectiveness, making it well suited for blind navigation applications.
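The particle-filter fusion mentioned above can be sketched in a few lines. This is a generic bootstrap-style weighting step, not the paper's implementation: a single static obstacle is assumed, and one radar range measurement and one RGB-D depth measurement are fused by weighting particles with the joint measurement likelihood. The particle count, prior, and noise levels are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Particle-filter fusion of one radar range measurement and one RGB-D
# depth measurement of the same obstacle. All numbers are assumed.
n = 2000
particles = rng.uniform(5.0, 15.0, n)   # uniform prior over range [m]
weights = np.full(n, 1.0 / n)

def likelihood(z, mu, sigma):
    # Unnormalized Gaussian likelihood of measurement z given range mu.
    return np.exp(-0.5 * ((z - mu) / sigma) ** 2)

z_radar, sigma_radar = 9.8, 0.5   # radar range and noise (assumed)
z_rgbd, sigma_rgbd = 9.9, 0.2     # RGB-D depth and noise (assumed)

# Weight each particle by the joint likelihood of both measurements.
weights *= likelihood(z_radar, particles, sigma_radar)
weights *= likelihood(z_rgbd, particles, sigma_rgbd)
weights /= weights.sum()

estimate = float(np.sum(weights * particles))   # posterior mean range
print(round(estimate, 2))
```

The posterior mean lands between the two measurements, weighted toward the lower-noise RGB-D reading; a full filter would add a motion model, resampling, and repeated cycles.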
Detecting crosswalks at urban intersections and alerting users to them is one of the most important demands of people with visual impairments. A real-time crosswalk detection algorithm, adaptive extraction and consistency analysis (AECA), is proposed. Compared with existing algorithms, which detect crosswalks in ideal scenarios, the AECA algorithm performs better in challenging scenarios, such as crosswalks at far distances, low-contrast crosswalks, pedestrian occlusion, varying illuminance, and the limited resources of portable PCs. Bright stripes of crosswalks are extracted by adaptive thresholding and gathered into crosswalks by consistency analysis. On the testing dataset, the proposed algorithm achieves a precision of 84.6% and a recall of 60.1%, both higher than those of the bipolarity-based algorithm. The position and orientation of crosswalks are conveyed to users by voice prompts, so that users can align themselves with crosswalks and walk along them. Field tests carried out in various practical scenarios prove the effectiveness and reliability of the proposed navigation approach.
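The extract-then-verify idea behind AECA can be sketched on a toy example. This is not the paper's algorithm: it works on a single synthetic image row, approximates the adaptive threshold with the row mean plus a margin, and checks only that several bright stripes of similar width were found. The row data and tolerances are made up for illustration.

```python
import numpy as np

# Toy sketch: extract bright stripes from one image row by thresholding,
# then check stripe-width consistency, as a crosswalk's bands should be
# roughly uniform. All values are assumed for illustration.
row = np.array([40] * 10 + ([220] * 8 + [45] * 8) * 4 + [40] * 10,
                dtype=float)

# A pixel is "bright" if it exceeds the local mean (approximated here by
# the row mean) plus a fixed margin.
mask = row > row.mean() + 20

# Find runs of consecutive bright pixels (candidate stripes).
edges = np.flatnonzero(np.diff(mask.astype(int)))
runs = edges.reshape(-1, 2)            # (start, end) index of each stripe
widths = runs[:, 1] - runs[:, 0]

# Consistency analysis: require several stripes of similar width.
consistent = len(widths) >= 3 and widths.std() / widths.mean() < 0.25
print(len(widths), consistent)   # prints: 4 True
```

A real detector would apply the threshold in local windows, scan every row, and additionally check stripe spacing and orientation across rows before declaring a crosswalk.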