| Field | Value |
|---|---|
| Graduate student | 劉良謙 Liu, Liang-Chien |
| Thesis title | 高速公路之汽車前方防撞輔助系統 Forward Collision Avoidance Assist System for Vehicles on Highways |
| Advisor | 方瓊瑤 Fang, Chiung-Yao |
| Degree | Master |
| Department | 資訊工程學系 Department of Computer Science and Information Engineering |
| Year of publication | 2015 |
| Graduation academic year | 103 (ROC calendar, 2014–2015) |
| Language | English |
| Pages | 52 |
| Chinese keywords | 前方防撞輔助系統、影像基底系統、道路標線偵測、車輛追蹤、距離估計公式、影像處理 |
| English keywords | forward collision avoidance assist system, vision-based, lane marking detection, vehicle tracking, distance estimation, image processing |
| Thesis type | Academic thesis |
This thesis proposes a forward collision avoidance assist system (FCAAS) that combines lane marking detection, vehicle tracking, and distance estimation techniques. First, the lane marking detection technique applies the RANSAC algorithm to extract the straight lines extended from lane markings in an IPM image processed by steerable filters, and a Kalman filter then tracks the extracted lines. Second, the vehicle tracking technique implements multiple-vehicle tracking with a particle filter, which individually tracks each vehicle detected by an AdaBoost classifier. The thesis improves the sampling strategy of the particle filter so that it frames the vehicles in the image more accurately while reducing the number of particles that must be sampled in each iteration. In addition, the thesis derives a novel distance estimation (DE) formula to compute the distance between the ego vehicle and the vehicle ahead. The DE formula is carefully verified by using the regulated lengths of lane markings to relate lane marking positions in the image to real-world distances, which shows that the formula meets the demands of real environments. Experiments on many highway video sequences demonstrate that the FCAAS has the potential to operate correctly in real scenes and meets real-time requirements, running at 22 frames per second.
This thesis proposes a novel forward collision avoidance assist system (FCAAS) comprising lane marking detection, vehicle tracking, and distance estimation (DE) techniques. First, the lane marking detection technique uses the RANSAC algorithm to extract the lines of the lane markings from candidate points collected in an inverse perspective mapping (IPM) image filtered by steerable filters; a Kalman filter then tracks the extracted lines accurately and efficiently. Second, the vehicle tracking technique implements multiple-vehicle tracking with a particle filter, which tracks the vehicles detected by an AdaBoost classifier. The particle filter is improved to predict the next movement of each vehicle and to spread the particles near the predicted location rather than around the current location, which reduces the number of particles required. Finally, an innovative DE formula is derived to estimate the distance between the ego vehicle and the vehicle ahead. The formula is verified against several reference points in the image whose real-world positions can be measured from the regulated dimensions of lane markings; this verification demonstrates that the formula is feasible in practice. The FCAAS shows its potential in practical scenes on many experimental sequences acquired from real highways, and it meets real-time requirements at a processing speed of 22 frames per second.
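To make the lane marking stage concrete, the following is a minimal Python/OpenCV sketch of that style of pipeline, not the thesis's implementation: the homography points, the IPM output size, the separable second-derivative kernel standing in for the steerable-filter response, the response threshold, and the RANSAC parameters are all placeholder assumptions.

```python
import cv2
import numpy as np

def detect_lane_line(frame, src_pts, dst_pts, ransac_iters=200, inlier_tol=2.0):
    """Sketch of IPM warp -> oriented second-derivative filtering -> RANSAC line fit.

    src_pts / dst_pts: assumed 4x2 float32 calibration points defining the
    road-plane homography (the thesis record does not specify these values).
    """
    # 1. Inverse perspective mapping: warp the road region to a bird's-eye view.
    H = cv2.getPerspectiveTransform(src_pts, dst_pts)
    ipm = cv2.warpPerspective(frame, H, (400, 600))            # assumed IPM size
    gray = cv2.cvtColor(ipm, cv2.COLOR_BGR2GRAY).astype(np.float32)

    # 2. Oriented filtering: a separable second-derivative kernel along x stands in
    #    for the steerable-filter response to near-vertical lane markings.
    kx, ky = cv2.getDerivKernels(2, 0, 7)
    response = np.abs(cv2.sepFilter2D(gray, -1, kx, ky))
    ys, xs = np.where(response > np.percentile(response, 99))  # strongest responses
    pts = np.stack([xs, ys], axis=1).astype(np.float32)
    if len(pts) < 2:
        return None

    # 3. RANSAC: fit a line x = a*y + b through two random points and keep the
    #    hypothesis with the most inliers.
    best_model, best_inliers = None, 0
    for _ in range(ransac_iters):
        i, j = np.random.choice(len(pts), 2, replace=False)
        (x1, y1), (x2, y2) = pts[i], pts[j]
        if y1 == y2:
            continue
        a = (x2 - x1) / (y2 - y1)
        b = x1 - a * y1
        err = np.abs(pts[:, 0] - (a * pts[:, 1] + b))
        inliers = int((err < inlier_tol).sum())
        if inliers > best_inliers:
            best_model, best_inliers = (a, b), inliers
    return best_model  # (slope, intercept) of the lane line in IPM coordinates
```

In a full system, a Kalman filter (for example cv2.KalmanFilter) would then smooth the fitted (slope, intercept) parameters from frame to frame, which is the role the abstract assigns to Kalman tracking.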
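The key change described for the tracking stage is where the particles are spread: around each vehicle's predicted position rather than its current one, so fewer particles are needed. Below is a minimal NumPy sketch of one such update step under assumed ingredients: a constant-velocity prediction and an `appearance_score` callable standing in for the thesis's motion model and appearance likelihood; the detector that initializes a track is outside the sketch.

```python
import numpy as np

def particle_filter_step(particles, weights, velocity, appearance_score,
                         n_particles=100, spread=8.0):
    """One update of a bounding-box-centre particle filter.

    particles: (N, 2) array of box centres; weights: (N,) normalized weights;
    velocity: estimated (dx, dy) motion per frame; appearance_score: assumed
    callable mapping a centre to a similarity weight for the tracked vehicle.
    """
    # Predict: spread new particles around the *predicted* centre (current
    # estimate plus velocity) instead of around the current centre, so fewer
    # particles are needed to cover where the vehicle will actually be.
    centre = np.average(particles, axis=0, weights=weights)
    predicted = centre + np.asarray(velocity, dtype=np.float64)
    particles = predicted + np.random.normal(0.0, spread, size=(n_particles, 2))

    # Weight: score each particle by appearance similarity to the tracked vehicle.
    weights = np.array([appearance_score(p) for p in particles], dtype=np.float64)
    weights /= weights.sum() + 1e-12

    # Resample (systematic): keep particles in proportion to their weights.
    positions = (np.arange(n_particles) + np.random.rand()) / n_particles
    idx = np.minimum(np.searchsorted(np.cumsum(weights), positions),
                     n_particles - 1)
    return particles[idx], np.full(n_particles, 1.0 / n_particles)
```

An off-the-shelf boosted Haar cascade (cv2.CascadeClassifier.detectMultiScale with a vehicle-trained cascade) could stand in for the AdaBoost classifier that initializes each track in the abstract.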
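The DE formula itself is the thesis's own derivation and is not reproduced here; the sketch below only illustrates the common flat-road pinhole relation that monocular distance estimators of this kind are typically built on. The focal length, camera height, and horizon row are hypothetical calibration constants; calibrating against the regulated lengths of lane markings, as the abstract describes, would amount to choosing such constants so that known marking positions in the image map to their true distances.

```python
def estimate_distance(y_bottom, focal_px=700.0, cam_height_m=1.3, horizon_row=240.0):
    """Flat-road pinhole approximation (illustrative only, not the thesis's DE formula):
    a road point imaged at row y_bottom lies roughly Z = f * h / (y_bottom - horizon_row)
    metres ahead, where f is the focal length in pixels and h the camera height in metres.
    All three constants here are hypothetical calibration values."""
    dy = y_bottom - horizon_row
    if dy <= 0:
        return float("inf")  # at or above the horizon: effectively infinitely far
    return focal_px * cam_height_m / dy

# Example: the bottom edge of a detected vehicle at image row 400
# maps to roughly 700 * 1.3 / (400 - 240) ≈ 5.7 m ahead.
```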