A reliable NLOS error identification method based on LightGBM driven by multiple features of GNSS signals
- Authors
Xiaohong Zhang, Xinyu Wang, Wanke Liu, Xianlu Tao, Yupeng Gu, Hailu Jia, and Chuanming Zhang
- Subjects
Urban environment, GNSS signal feature, Non-line-of-sight identification, LightGBM, Fisheye camera, Technology (General), T1-995
- Abstract
In complicated urban environments, Global Navigation Satellite System (GNSS) signals are frequently affected by building reflection or refraction, resulting in Non-Line-of-Sight (NLOS) errors. In severe cases, NLOS errors can cause ranging errors of hundreds of meters, which substantially degrade the precision and dependability of GNSS positioning. To address this problem, we propose a reliable NLOS error identification method based on the Light Gradient Boosting Machine (LightGBM), driven by multiple features of GNSS signals. The sample data are first labeled using a fisheye camera to classify the signals from visible satellites as Line-of-Sight (LOS) or NLOS signals. We then analyze the sample data to determine the correlation among multiple features, such as the signal-to-noise ratio, elevation angle, pseudorange consistency, phase consistency, Code Minus Carrier, and Multi-Path combined observations. Finally, we introduce the LightGBM model to establish an effective correlation between signal features and satellite visibility, and adopt a multifeature-driven scheme to achieve reliable identification of NLOS signals. The test results show that the proposed method is superior to other methods, such as Extreme Gradient Boosting (XGBoost), in terms of accuracy and usability. The model demonstrates a classification accuracy of approximately 90% with minimal time consumption. Furthermore, after excluding NLOS signals, the Standard Point Positioning results show that the Root Mean Square errors are reduced by 47.82%, 56.68%, and 36.68% in the east, north, and up directions, respectively, and the overall positioning performance is significantly improved.
- Published
- 2024