Volume 17, Issue 2, March 2024
Citation: WEI Rui-li, WANG Ming-jun, ZHOU Yi-ming, YI Fang. Modeling and correction of measurement errors based on depth cameras[J]. Chinese Optics, 2024, 17(2): 271-277. doi: 10.37188/CO.2023-0047

Modeling and correction of measurement errors based on depth cameras

doi: 10.37188/CO.2023-0047
Funds:  Supported by the National Natural Science Foundation of China (Grant No. 92052106, No. 61771385, No. 62101313) and Shaanxi Province Science Foundation for Distinguished Young Scholars (Grant No. 2020JC-42)
More Information
  • Corresponding author: wangmingjun@xaut.edu.cn
  • Received Date: 23 Mar 2023
  • Revised Date: 19 Apr 2023
  • Available Online: 13 Sep 2023
  • Abstract: The Time-of-Flight (ToF) depth camera is an important means of acquiring three-dimensional point cloud data, but it is limited by its own hardware and by the external environment, so its measurement data contain certain errors. Focusing on the non-systematic errors of the ToF depth camera, this paper experimentally verifies that the color, distance, and relative motion of the measured target all affect the data obtained by the depth camera, and that their error effects differ. A new measurement error model is proposed to correct the errors caused by color and distance, and a three-dimensional motion blur function is established to recover the error caused by relative motion. Numerical analysis of the established calibration model shows that the residual error due to distance and color is less than 4 mm and that the error caused by relative motion is less than 0.7 mm. This work improves the quality of ToF depth camera measurement data and provides more accurate data support for 3D point cloud reconstruction and related tasks.
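
    The abstract describes the two corrections only at a high level and does not give the explicit form of the error model. As a loose illustration only, and not the authors' published method, the Python sketch below shows one common way such corrections are implemented: a per-color polynomial fit that maps measured distance to corrected distance, and a simple Wiener-style frequency-domain deconvolution to reduce motion blur in a depth map. All names (fit_error_model, correct_depth, wiener_deblur, the calibration arrays) are hypothetical.

        import numpy as np

        def fit_error_model(calib_measured, calib_true, degree=3):
            # Fit a polynomial mapping measured distance -> ground-truth distance
            # for one target color; calib_* are 1-D arrays from a calibration run.
            coeffs = np.polyfit(calib_measured, calib_true, degree)
            return np.poly1d(coeffs)

        def correct_depth(depth_map, models_by_color, color_labels):
            # Apply the per-color correction model to every pixel of a depth map.
            # color_labels has the same shape as depth_map (one color id per pixel).
            corrected = np.array(depth_map, dtype=float, copy=True)
            for color, model in models_by_color.items():
                mask = color_labels == color
                corrected[mask] = model(depth_map[mask])
            return corrected

        def wiener_deblur(depth_map, kernel, snr_inv=0.01):
            # Rough frequency-domain Wiener deconvolution of a motion-smeared
            # depth map, given an estimated 2-D motion-blur kernel.
            H = np.fft.fft2(kernel, s=depth_map.shape)
            G = np.fft.fft2(depth_map)
            W = np.conj(H) / (np.abs(H) ** 2 + snr_inv)
            return np.real(np.fft.ifft2(W * G))

        if __name__ == "__main__":
            # Synthetic usage example: fit a correction from fake calibration data
            # and apply it to a small depth map with a single color id (0).
            rng = np.random.default_rng(0)
            true_d = np.linspace(0.5, 4.0, 50)                         # metres
            measured_d = true_d + 0.01 * true_d ** 2 + rng.normal(0, 0.002, 50)
            model = fit_error_model(measured_d, true_d)
            depth = rng.uniform(0.5, 4.0, (4, 4))
            labels = np.zeros((4, 4), dtype=int)
            print(correct_depth(depth, {0: model}, labels))

    In practice the fitted model would be a function of both distance and target color, and the blur kernel would be estimated from the relative motion; the separable treatment above is only for readability of the sketch.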

     

