
Night vision intelligent detection method of scatterable landmines

WANG Chi, YU Ming-kun, YANG Chen-ye, LI Si-yuan, LI Fu-di, LI Jin-hui, FANG Dong, LUAN Xin-qun

Citation: WANG Chi, YU Ming-kun, YANG Chen-ye, LI Si-yuan, LI Fu-di, LI Jin-hui, FANG Dong, LUAN Xin-qun. Night vision intelligent detection method of scatterable landmines[J]. Chinese Optics, 2021, 14(5): 1202-1211. doi: 10.37188/CO.2020-0214


doi: 10.37188/CO.2020-0214
Funds: Supported by the National Natural Science Foundation of China (No. 41704123, No. 61773249) and the Science and Technology on Near-Surface Detection Laboratory Foundation (No. TCGZ2020C003)
More Information
    Author biographies:

    WANG Chi (1982—), male, born in Taikang, Henan Province. Ph.D. (postdoctoral), professor. He received his Ph.D. from Tianjin University in 2009 and is currently a faculty member of the School of Mechatronic Engineering and Automation, Shanghai University. His research focuses on precision measurement and instrumentation. E-mail: wangchi@shu.edu.cn

    LUAN Xin-qun (1968—), female, born in Taizhou, Jiangsu Province. M.Sc., senior engineer. She received her B.Sc. from the National University of Defense Technology in 1990 and her M.Sc. from Xi'an Jiaotong University in 2006. Her research focuses on near-surface target detection technology. E-mail: xinqun_luan@126.com

  • CLC number: TN247; TN223; TP212.6

  • Abstract: This paper proposes a night vision intelligent detection method for scatterable landmines based on machine learning. First, an intelligent detection network model for scatterable landmines is designed and optimized based on the YOLO family of machine-learning algorithms. Second, a distance measurement model for scatterable landmines is derived from the similarity principle of geometric optical imaging. Finally, a night vision intelligent detection system for scatterable landmines is built for experimental testing and analysis. The experimental results show that the optimized detection network achieves a precision of 98.97%, a recall of 99.22%, and a mean average precision (mAP) of 99.2%. Under the given experimental conditions, the optimized distance measurement model estimates the distance to a scatterable landmine with an error within ±10 cm, indicating that machine learning can be applied to the intelligent detection of scatterable landmines.
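
    The distance measurement model mentioned in the abstract rests on the similarity of geometric optical imaging, i.e. the similar triangles of the pinhole camera model: the target's physical size relates to its image size as the object distance relates to the focal length. A minimal sketch of that relation, assuming a known mine diameter, a calibrated focal length in pixels, and a bounding-box size supplied by the detector (the function and parameter names are illustrative, not the authors' code):

        def estimate_distance_cm(focal_length_px: float,
                                 target_size_cm: float,
                                 bbox_size_px: float) -> float:
            """Monocular ranging by imaging similarity (pinhole model):
            target_size / bbox_size = distance / focal_length,
            hence distance = focal_length * target_size / bbox_size."""
            if bbox_size_px <= 0:
                raise ValueError("bounding-box size must be positive")
            return focal_length_px * target_size_cm / bbox_size_px

        # Illustrative numbers only (not measurements from the paper):
        # a 30 cm-wide mine imaged at 180 px with a 3000 px focal length
        # would sit about 5 m (500 cm) from the camera.
        print(estimate_distance_cm(focal_length_px=3000,
                                   target_size_cm=30.0,
                                   bbox_size_px=180))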

     

  • Figure 1.  YOLO(V2) network structure diagram

    Figure 2.  PR curves of the model’s test set

    Figure 3.  Schematic diagram of the distance measurement principle

    Figure 4.  Scatterable landmines used in the experiment

    Figure 5.  Diagram of the intelligent detection system for scatterable landmines

    Figure 6.  Type 72 anti-tank metal landmine

    Figure 7.  Scatterable landmines against a flat ground background

    Figure 8.  Type 58 anti-infantry rubber landmine with grass in the background

    Figure 9.  Gaussian fitting curves

    Table 1.  Training parameters

    Parameter                                    Value
    Batch size for network weight updates        64
    Subdivisions (sub-batches per batch)         8
    Training image width (px)                    832
    Training image height (px)                   832
    Momentum                                     0.9
    Weight decay coefficient                     0.0005
    Learning rate                                0.001
    Number of training iterations                100200
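
    The values in Table 1 match the hyperparameters of a typical Darknet/YOLO training configuration. A hedged sketch collecting them in one place (the key names below are illustrative and not taken from the authors' configuration files; the values are the ones reported in the table):

        # Training hyperparameters as reported in Table 1.
        TRAINING_PARAMS = {
            "batch": 64,             # images per network weight update
            "subdivisions": 8,       # sub-batches each batch is split into
            "width": 832,            # training image width in pixels
            "height": 832,           # training image height in pixels
            "momentum": 0.9,
            "weight_decay": 0.0005,
            "learning_rate": 0.001,
            "max_iterations": 100200,
        }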

    Table 2.  Relevant indexes during test set testing

                          Instances   True mines   False mines   Recall    Precision   mAP
    Before optimization   387         374          11            96.64%    97.14%      95.286%
    After optimization    387         384          4             99.22%    98.97%      99.2%
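
    The recall and precision in Table 2 follow directly from the detection counts, assuming "True mines" are true positives and "False mines" are false positives against the 387 labelled instances; the mAP column additionally integrates the PR curves of Fig. 2 and is not recomputed here. A short check of the after-optimization row:

        def precision_recall(instances: int, true_mines: int, false_mines: int):
            """Precision = TP / (TP + FP); recall = TP / (TP + FN),
            with FN = labelled instances that were not detected."""
            precision = true_mines / (true_mines + false_mines)
            recall = true_mines / instances
            return precision, recall

        # After optimization (Table 2): 387 instances, 384 true detections, 4 false alarms.
        p, r = precision_recall(387, 384, 4)
        print(f"precision = {p:.2%}, recall = {r:.2%}")  # 98.97%, 99.22%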

    Table 3.  Experimental data of distance measurement for scatterable landmines

    No.   Laser rangefinder distance/cm   Pre-optimization algorithm distance/cm   Error/cm   Relative error
    1     461.3                           465.9                                    4.6        0.99%
    2     582.0                           595.3                                    13.3       2.28%
    3     641.5                           662.4                                    20.9       3.26%
    4     782.6                           818.0                                    35.4       4.52%
    5     960.5                           1014.9                                   54.4       5.66%
    6     1083.8                          1155.5                                   71.7       6.62%
    7     1284.8                          1387.6                                   102.8      8.00%
    8     1343.4                          1470.0                                   126.6      9.42%
    9     1464.5                          1618.9                                   154.4      10.54%
    10    1584.1                          1775.3                                   191.2      12.07%
    11    1786.5                          2033.2                                   246.7      13.81%
    12    1844.4                          2119.9                                   275.5      14.94%
    13    1906.7                          2199.7                                   293.0      15.37%
    14    2088.4                          2466.8                                   378.4      18.12%
    15    2147.9                          2657.8                                   509.9      23.73%
    16    2285.4                          2791.9                                   506.5      22.16%

    Table 4.  Experimental data of the distance between the scatterable landmine and the camera after optimizing the algorithm

    No.   Laser rangefinder distance/cm   Post-optimization algorithm distance/cm   Error/cm   Relative error
    1     461.3                           484.2                                     22.9       4.96%
    2     582.0                           584.5                                     2.5        0.43%
    3     641.5                           640.0                                     −1.5       −0.23%
    4     782.6                           775.5                                     −7.1       −0.91%
    5     960.5                           954.5                                     −6.0       −0.62%
    6     1083.8                          1082.2                                    −1.6       −0.15%
    7     1284.8                          1283.8                                    −1.0       −0.08%
    8     1343.4                          1351.5                                    8.1        0.60%
    9     1464.5                          1469.0                                    4.5        0.31%
    10    1584.1                          1588.2                                    4.1        0.26%
    11    1786.5                          1784.8                                    −1.7       −0.09%
    12    1844.4                          1851.8                                    7.4        0.40%
    13    1906.7                          1912.9                                    6.2        0.32%
    14    2088.4                          2097.0                                    8.5        0.41%
    15    2147.9                          2153.0                                    4.9        0.23%
    16    2285.4                          2285.6                                    0.2        0.01%
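
    In Tables 3 and 4 the error column is the difference between the algorithm's estimate and the laser rangefinder reference, and the relative error normalizes that difference by the reference distance. A small helper reproducing the last row of Table 4 (names are illustrative):

        def ranging_error(reference_cm: float, measured_cm: float):
            """Absolute and relative error of a distance estimate
            against the laser rangefinder reference."""
            error_cm = measured_cm - reference_cm
            return error_cm, error_cm / reference_cm

        # Row 16 of Table 4: reference 2285.4 cm, optimized estimate 2285.6 cm.
        err, rel = ranging_error(2285.4, 2285.6)
        print(f"error = {err:.1f} cm, relative error = {rel:.2%}")  # 0.2 cm, 0.01%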
Publication history
  • Received:  2020-12-22
  • Revised:  2021-01-14
  • Published online:  2021-03-27
  • Issue published:  2021-09-18
