Self-Localization of a Mobile Robot Based on the Hybrid Extended Kalman Filter and Computer Vision
- Subject (keywords): Mobile Robot
- Institution: Korea University Graduate School
- Advisor: Myo-taeg Lim
- Publication year: 2009
- Submission date: 2008-12-03
- Degree conferred: February 2009
- Degree: Master's
- Department: Graduate School of Engineering, Major in Electronic and Computer Engineering
- Pages: 66 p.
- URI: http://www.dcollection.net/handler/korea/000000007168
- Language: English
- Submitted original: 000045535008
Abstract
Self-localization is an essential capability for mobile robots navigating in known and unknown environments. For the tricycle type of mobile robot, the Extended Kalman Filter has served as a powerful tool for updating and estimating position and orientation during navigation. In this thesis, we present an improved method for solving the localization problem with a highly accurate model of a mobile robot in an uncertain, large-scale environment. The first phase of our approach is an intensive analysis of the dead-reckoning model for the tricycle robot type. We propose a localization algorithm based on a Hybrid Extended Kalman Filter using artificial beacons. Simulations of each observation are then carried out, and the odometry data is updated to estimate the robot position. A comparison between the real and estimated beacon locations, together with an analysis of the filter's performance, is carried out using a 360° sensor scan. The simulation results show that the proposed algorithm enables the robot to navigate robustly in uncertain environments. In the second phase, we apply the acquired tricycle robot model and the proposed algorithm to experiments. Here, the 360° sensor is replaced by a series of sonar sensors, and a single-CCD camera is integrated on the mobile robot to compute the positions of correspondences in 3D space based on their projections onto the camera image. By comparing sparse prerecorded images taken during navigation, the robot is able to update its position and orientation more precisely. Experiments carried out with the Pioneer 3-DX mobile robot show the good performance of image matching for global localization.
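The predict/update cycle described in the abstract — dead-reckoning prediction from odometry followed by a correction from a beacon observation — can be illustrated with a minimal EKF step. This is a generic sketch, not the thesis's actual Hybrid EKF: the unicycle-style motion model, the range–bearing measurement model, and the noise covariances `Q` and `R` are illustrative assumptions.

```python
import numpy as np

def ekf_localize_step(x, P, u, z, beacon, Q, R, dt):
    """One EKF predict/update cycle for a wheeled robot.

    x: state [px, py, theta]; P: 3x3 state covariance.
    u: odometry control [v, omega]; z: range-bearing measurement [r, phi]
    of a known beacon at (bx, by). Q, R: process/measurement noise
    covariances (illustrative values, not from the thesis).
    """
    px, py, th = x
    v, w = u

    # --- Predict: dead-reckoning motion model ---
    x_pred = np.array([px + v * dt * np.cos(th),
                       py + v * dt * np.sin(th),
                       th + w * dt])
    # Jacobian of the motion model w.r.t. the state
    F = np.array([[1.0, 0.0, -v * dt * np.sin(th)],
                  [0.0, 1.0,  v * dt * np.cos(th)],
                  [0.0, 0.0,  1.0]])
    P_pred = F @ P @ F.T + Q

    # --- Update: expected range/bearing to the beacon ---
    dx, dy = beacon[0] - x_pred[0], beacon[1] - x_pred[1]
    q = dx**2 + dy**2
    z_hat = np.array([np.sqrt(q), np.arctan2(dy, dx) - x_pred[2]])
    # Jacobian of the measurement model w.r.t. the state
    H = np.array([[-dx / np.sqrt(q), -dy / np.sqrt(q),  0.0],
                  [ dy / q,          -dx / q,          -1.0]])
    y = z - z_hat
    y[1] = (y[1] + np.pi) % (2 * np.pi) - np.pi  # wrap bearing innovation
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)          # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(3) - K @ H) @ P_pred
    return x_new, P_new
```

When the measurement exactly matches the prediction, the innovation is zero and the update leaves the predicted pose unchanged; in general, the gain `K` blends odometry drift against beacon evidence according to the two covariances.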
Table of Contents
CONTENTS III
LIST OF FIGURES V
LIST OF TABLES VII
LIST OF ACRONYMS VIII
1. INTRODUCTION 1
1.1. MOBILE ROBOT OVERVIEW 1
1.2. LOCALIZATION PERCEPTION 2
1.3. THE EXTENDED KALMAN FILTER 6
2. MOBILE ROBOT AND BEACON OBSERVATION MODEL 10
2.1. MOBILE ROBOT MODEL 10
2.1.1. Robot pose representation 10
2.1.2. Calibration of Systematic Error 12
2.1.3. Odometry Error Model 13
2.1.4. Robot translation 16
2.2. BEACON OBSERVATION ERROR MODEL 18
2.2.1. Measurement update 18
2.2.2. Maximum likelihood correspondence 19
3. HEKF-BASED OPTIMAL ESTIMATION AND SSU ALGORITHM 21
3.1. HYBRID EXTENDED KALMAN FILTER APPROACH 21
3.2. SSU ALGORITHM 23
4. SIMULATION RESULTS 25
5. VISION BASED LOCALIZATION 29
5.1. EXTRACT FEATURES FROM SEQUENTIAL IMAGES 29
5.1.1. Mathematical approach 29
5.1.2. Refine matching correspondences using optical flow field 32
5.1.3. Camera calibration 37
5.2. SYNTHESIZE VISUAL SENSOR TO SYSTEM 40
5.2.1. Covariance calculation of optical feature 42
6. EXPERIMENTS 45
7. CONCLUSION 52
8. BIBLIOGRAPHY 53
9. ACKNOWLEDGMENTS 55

