Performance analysis of position estimation based on laser scanner and SLAM
In this paper, the performance of position and attitude estimation using a 2D laser scanner is discussed.
In environments such as cities, accurate position estimation with GPS alone is difficult for reasons such as multipath and signal blockage. Combined systems such as GPS/INS and GPS/DR have therefore been studied, as have localization techniques that fuse vehicle-mounted sensors such as cameras and laser scanners. In particular, recent studies have worked to improve the position and attitude estimation performance of robots and cars using laser scanners [1-3]. More recently, research using the Velodyne 3D laser scanner has also appeared [4].
In order to use laser scanner information, feature points of the surrounding environment must be extracted; this applies equally when image information is used. Unlike a camera, however, a laser scanner directly provides intuitive information, namely the angle and distance to nearby landmarks, and its distance measurements are very accurate, with errors of only a few centimeters. In this paper, we extract the positions of surrounding landmarks with a laser scanner and apply SLAM (Simultaneous Localization and Mapping) [5] to estimate the position and attitude of the vehicle.
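As a rough illustration of the landmark-extraction step described above, the sketch below clusters the beams of a single 2D scan wherever the range jumps, and takes each cluster's centroid as a landmark position in the vehicle frame. The function name and the `max_range`/`gap` thresholds are illustrative assumptions, not values from the paper.

```python
import math

def extract_landmarks(angles, ranges, max_range=8.0, gap=0.3):
    """Cluster a 2D scan into landmark centroids (vehicle frame).

    angles, ranges: per-beam bearing (rad) and distance (m).
    max_range: beams at or beyond this are treated as 'no return'.
    gap: a jump between consecutive points larger than this (m)
         starts a new cluster.
    """
    # Convert valid range-bearing returns to Cartesian points.
    points = [(r * math.cos(a), r * math.sin(a))
              for a, r in zip(angles, ranges) if r < max_range]
    # Split consecutive returns into clusters at large jumps.
    clusters, current, prev = [], [], None
    for p in points:
        if prev is not None and math.dist(p, prev) > gap:
            clusters.append(current)
            current = []
        current.append(p)
        prev = p
    if current:
        clusters.append(current)
    # One landmark per cluster: the centroid of its points.
    return [(sum(x for x, _ in c) / len(c), sum(y for _, y in c) / len(c))
            for c in clusters]
```

A scan that sweeps over two separated obstacles yields two centroids, one per obstacle; the centroids then serve as the landmark observations fed to the filter.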
Experimental setup
For the experiment, cylindrical obstacles about 70 cm in diameter and conical obstacles about 30 cm in diameter were placed in the environment, and the vehicle was set to drive a specified path between them. Figure 1 shows the arrangement of obstacles for the experiment.
In Figure 1, the red stars are obstacles; starting from (0, 0), the unmanned ground vehicle travels between the obstacles in a counter-clockwise direction.
Kalman filter
In this paper, we used the bicycle model. With state $x_k = (E_k, N_k, \theta_k)$ and sampling interval $\Delta t$, the system equation of the bicycle model is

$$E_{k+1} = E_k + v_k \Delta t \cos\theta_k, \qquad N_{k+1} = N_k + v_k \Delta t \sin\theta_k, \qquad \theta_{k+1} = \theta_k + \Delta\theta$$

The state $(E, N, \theta)$ represents the east coordinate, the north coordinate, and the heading angle, respectively. The input $u_k = (v_k, \delta_k)$ represents the goal speed and the steering angle, respectively. The transition (Jacobian) matrix for this model is

$$F_k = \begin{bmatrix} 1 & 0 & -v_k \Delta t \sin\theta_k \\ 0 & 1 & v_k \Delta t \cos\theta_k \\ 0 & 0 & 1 \end{bmatrix}$$
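The bicycle-model prediction step can be sketched as below. The function name and the default wheelbase value are illustrative assumptions; the paper does not report the vehicle's dimensions.

```python
import math

def predict(state, v, delta, dt, L=1.0):
    """One time-update step of the rear-axle bicycle model.

    state: (E, N, theta) -- east/north position (m) and heading (rad).
    v, delta: commanded speed (m/s) and steering angle (rad).
    L: wheelbase (m) -- illustrative default, not from the paper.
    """
    E, N, theta = state
    E += v * dt * math.cos(theta)
    N += v * dt * math.sin(theta)
    # Heading change of the bicycle model: (v*dt/L) * tan(delta).
    theta += (v * dt / L) * math.tan(delta)
    # Wrap heading to (-pi, pi] so angle errors stay small.
    theta = math.atan2(math.sin(theta), math.cos(theta))
    return (E, N, theta)
```

Driving straight (`delta = 0`) at 1 m/s for 1 s from the origin moves the state to (1, 0) with unchanged heading, which matches the system equation above term by term.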
Here, $L$ is the wheelbase, i.e., the distance between the rear axle and the front axle of the vehicle, and $\Delta\theta$ is the change in the heading angle of the vehicle over one step, calculated by the following relationship:

$$\Delta\theta = \frac{v_k \Delta t}{L} \tan\delta_k$$
Here, $n$ is the number of measurements processed per update and $\theta$ is the current heading angle of the vehicle. For a landmark at $(x_i, y_i)$, the laser scanner measures the distance $r_i = \sqrt{(x_i - E)^2 + (y_i - N)^2}$ and the bearing $\beta_i = \operatorname{atan2}(y_i - N,\, x_i - E) - \theta$, so the observation (Jacobian) matrix is

$$H_i = \begin{bmatrix} -\dfrac{x_i - E}{r_i} & -\dfrac{y_i - N}{r_i} & 0 \\[4pt] \dfrac{y_i - N}{r_i^2} & -\dfrac{x_i - E}{r_i^2} & -1 \end{bmatrix}$$
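A minimal sketch of the measurement-update step for one known landmark follows, assuming a standard EKF with a range-bearing observation; the function name and the example noise values are illustrative, not taken from the paper.

```python
import math
import numpy as np

def update(x, P, z, landmark, R):
    """EKF measurement update for one known landmark (range, bearing).

    x: state [E, N, theta]; P: 3x3 state covariance.
    z: measured (distance, angle) to the landmark.
    landmark: (lx, ly) landmark position in the map frame.
    R: 2x2 measurement-noise covariance.
    """
    E, N, theta = x
    lx, ly = landmark
    dx, dy = lx - E, ly - N
    r = math.hypot(dx, dy)
    # Predicted measurement: range, and bearing relative to the heading.
    h = np.array([r, math.atan2(dy, dx) - theta])
    # Jacobian of h with respect to the state [E, N, theta].
    H = np.array([[-dx / r,    -dy / r,     0.0],
                  [ dy / r**2, -dx / r**2, -1.0]])
    y = np.asarray(z, dtype=float) - h
    y[1] = math.atan2(math.sin(y[1]), math.cos(y[1]))  # wrap bearing residual
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = np.asarray(x, dtype=float) + K @ y
    P_new = (np.eye(3) - K @ H) @ P
    return x_new, P_new
```

When the measured range and bearing match the prediction exactly, the residual is zero and the state estimate is left unchanged while the covariance still shrinks, which is the expected EKF behavior.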
Experimental results
The standard deviations of the steering-angle and speed inputs were 3 degrees and 0.3 m/s, respectively, and the measurement errors (angle and distance) were set to 3 degrees and 0.3 m, respectively. Figure 2 shows the position error between the real trajectory and the estimated trajectory, Figure 3 shows the attitude error, and Figure 4 shows the driving trajectory. In the experimental results, the position error was within 4.5 m and the attitude error was 7 degrees or less.
Conclusion
In this paper, we performed an experiment on position and attitude estimation of a vehicle with a laser scanner. The experimental results were very reliable over short intervals. In future work, sensor fusion with GPS, an IMU, and vehicle-mounted sensors should be studied to further improve performance.
Acknowledgements
This work was supported by a grant from the Korea Research Council of Fundamental Science & Technology funded by the Ministry of Education, Science and Technology in 2013 [Project: A Study on Satellite-based Position Tracking Technology for Calamity Prevention and Public Safety Improvement].
References
[1] Kai Lingemann, Andreas Nüchter, Joachim Hertzberg, Hartmut Surmann, “High-speed laser localization for mobile robots,” Robotics and Autonomous Systems, 2005, pp. 275-296.
[2] Z. J. Chong, B. Qin, T. Bandyopadhyay, T. Wongpiromsarn, E. S. Rankin, “Autonomous Navigation in Crowded Campus Environments.”
[3] Andreas Geiger, Philip Lenz, Raquel Urtasun, “Are we ready for Autonomous Driving? The KITTI Vision Benchmark Suite.”
[4] Frank Moosmann, Christoph Stiller, “Velodyne SLAM,” 2011 IEEE Intelligent Vehicles Symposium, Baden-Baden, Germany, 2011.
[5] Eduardo Nebot, “Simultaneous Localization and Mapping,” 2002 Summer School, Australian Centre for Field Robotics, The University of Sydney NSW 2006, Australia, 2002.