
Global Map Generation and SLAM using LiDAR and Stereo Camera for tracking motion of Mobile Robot

dc.contributor.author: Álvarez-Gutiérrez, Edwin Leonel (spa)
dc.contributor.author: Jiménez-López, Fabián Rolando (spa)
dc.date: 2019-12-16 (spa)
dc.date.accessioned: 2020-01-17T14:47:38Z (spa)
dc.date.available: 2020-01-17T14:47:38Z (spa)
dc.identifier: http://revistas.ustabuca.edu.co/index.php/ITECKNE/article/view/2357 (spa)
dc.identifier: 10.15332/iteckne.v16i2.2357 (spa)
dc.identifier.uri: http://hdl.handle.net/11634/20657 (spa)
dc.description: Uno de los temas de mayor atención en la robótica móvil está relacionado con la localización y mapeo de un robot en un entorno determinado, y el otro, asociado a la selección de los dispositivos o sensores necesarios para adquirir la mayor cantidad de información externa posible para la generación de un mapa global. El propósito de este artículo es plantear la integración entre un robot móvil terrestre tipo oruga, tareas de SLAM con dispositivos LiDAR y el uso de estéreo visión a través de la cámara ZED para la generación de un mapa global en 2D y el seguimiento del movimiento del robot móvil mediante el software de MATLAB®. El experimento consiste en realizar diferentes pruebas de detección para determinar las distancias y hacer el seguimiento de la posición del robot móvil en un entorno estructurado en interiores, para observar el comportamiento de la plataforma móvil y determinar el error en las mediciones. Los resultados obtenidos muestran que los dispositivos integrados cumplen satisfactoriamente con las tareas establecidas en condiciones controladas y en entornos interiores, obteniendo porcentajes de error inferiores al 1 y 4% para el caso del LiDAR y la cámara ZED, respectivamente. Se desarrolló una alternativa que resuelve uno de los problemas más comunes de la robótica móvil en los últimos años y, adicionalmente, esta solución permite la posibilidad de fusionarse otro tipo de sensores como los sistemas inerciales, encoders, GPS, entre otros, con el fin de mejorar las aplicaciones en el área y la calidad de la información adquirida desde el exterior. (spa)
dc.description: One of the topics that receives the greatest attention in mobile robotics is the localization and mapping of a robot in a given environment; another is the selection of the devices or sensors needed to acquire as much external information as possible for generating a global map. The purpose of this article is to propose the integration of a caterpillar-type ground mobile robot, SLAM tasks with LiDAR devices, and the use of stereo vision through the ZED camera for generating a 2D global map and tracking the motion of the mobile robot using MATLAB® software. The experiment consists of performing different detection tests to determine distances and to track the position of the mobile robot in a structured indoor environment, in order to observe the behavior of the mobile platform and determine the measurement error. The results obtained show that the integrated devices satisfactorily fulfill the established tasks under controlled conditions in indoor environments, obtaining error percentages below 1% for the LiDAR and below 4% for the ZED camera. An alternative was developed that addresses one of the most common problems in mobile robotics in recent years; additionally, this solution allows fusion with other types of sensors, such as inertial systems, encoders, and GPS, in order to improve applications in the area and the quality of the information acquired from the environment. (eng)
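The abstract reports distance measurement with the ZED stereo camera and error percentages computed against known distances. As a minimal, illustrative sketch (not the authors' MATLAB code; the focal length and disparity values below are hypothetical), stereo depth follows the standard triangulation relation Z = f·B/d, and the reported error percentages can be computed against a ground-truth distance:

```python
def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Standard stereo triangulation: depth Z = f * B / d.

    focal_px     -- focal length in pixels (illustrative value below)
    baseline_m   -- distance between the two camera centers, in meters
    disparity_px -- horizontal pixel offset of a feature between left/right images
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px


def percent_error(measured: float, actual: float) -> float:
    """Relative measurement error as a percentage of the true value."""
    return abs(measured - actual) / actual * 100.0


# Illustrative numbers only: a 0.12 m baseline (the ZED's nominal stereo
# baseline) and an assumed 700 px focal length.
z = depth_from_disparity(focal_px=700.0, baseline_m=0.12, disparity_px=42.0)
err = percent_error(measured=z, actual=2.0)
```

A measured depth of 2.08 m against a 2.00 m ground truth, for example, would give a 4% error, matching the upper bound the abstract reports for the ZED camera.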
dc.format.mimetype: application/pdf (spa)
dc.language.iso: eng (spa)
dc.publisher: Universidad Santo Tomás. Seccional Bucaramanga (eng)
dc.relation: http://revistas.ustabuca.edu.co/index.php/ITECKNE/article/view/2357/1728 (spa)
dc.relation: /*ref*/ D. C. Slaughter, D. K. Giles, and D. Downey, “Autonomous robotic weed control systems: A review,” Comput. Electron. Agric., vol. 61, no. 1, pp. 63-78, Apr. 2008. DOI: https://doi.org/10.1016/j.compag.2007.05.008. (spa)
dc.relation: /*ref*/ D. Ball et al., “Robotics for Sustainable Broad-Acre Agriculture,” in L. Mejias, P. Corke, and J. Roberts (Eds.), Field and Service Robotics, Springer Tracts in Advanced Robotics, vol. 105, Springer, Cham, 2015. DOI: https://doi.org/10.1007/978-3-319-07488-7_30. (spa)
dc.relation: /*ref*/ A. Barrientos et al., “Aerial remote sensing in agriculture: A practical approach to area coverage and path planning for fleets of mini aerial robots,” J. Field Robot., vol. 28, no. 5, pp. 667-689, Sep. 2011. DOI: https://doi.org/10.1002/rob.20403. (spa)
dc.relation: /*ref*/ S. G. Tzafestas, “Mobile Robot Localization and Mapping,” in Introduction to Mobile Robot Control, pp. 479-531, Jan. 2014. ISBN: 9780124171039. (spa)
dc.relation: /*ref*/ J. Park, J. Y. Kim, B. Kim, and S. Kim, “Global Map Generation using LiDAR and Stereo Camera for Initial Positioning of Mobile Robot,” in 2018 International Conference on Information and Communication Technology Robotics (ICT-ROBOT), 2018, pp. 1-4. DOI: 10.1109/ICT-ROBOT.2018.8549897. (spa)
dc.relation: /*ref*/ B. A. C. Caldato, R. A. Filho, and J. E. C. Castanho, “ORB-ODOM: Stereo and odometer sensor fusion for simultaneous localization and mapping,” in 2017 Latin American Robotics Symposium (LARS) and 2017 Brazilian Symposium on Robotics (SBR), 2017, pp. 1-5. DOI: 10.1109/SBR-LARS-R.2017.8215301. (spa)
dc.relation: /*ref*/ L. Cheng, Y. Dai, R. Peng, and X. Nong, “Positioning and navigation of mobile robot with asynchronous fusion of binocular vision system and inertial navigation system,” Int. J. Adv. Robot. Syst., vol. 14, no. 6, Nov. 2017. DOI: 10.1177/1729881417745607. (spa)
dc.relation: /*ref*/ D. T. Savaria and R. Balasubramanian, “V-SLAM: Vision-based simultaneous localization and map building for an autonomous mobile robot,” in 2010 IEEE Conference on Multisensor Fusion and Integration, 2010, pp. 1-6. DOI: 10.1109/MFI.2010.5604466. (spa)
dc.relation: /*ref*/ G. Campion, G. Bastin, and B. D'Andréa-Novel, “Structural Properties and Classification of Kinematic and Dynamic Models of Wheeled Mobile Robots,” 1996. DOI: 10.1109/ROBOT.1993.292023. (spa)
dc.relation: /*ref*/ J. L. Jones, B. A. Seiger, and A. M. Flynn, Mobile Robots: Inspiration to Implementation, A. K. Peters, 1999. ISBN: 9781568810973. (spa)
dc.relation: /*ref*/ J. Borenstein et al., “Where am I? Sensors and Methods for Mobile Robot Positioning,” University of Michigan, prepared for the Oak Ridge National Laboratory (ORNL) D&D Program and the U.S. Department of Energy's Robotics Technology Development Program, 1996. (spa)
dc.relation: /*ref*/ SLAMTEC, “RPLIDAR-A2 Laser Range Scanner | SLAMTEC,” 2018. [Online]. Available: https://www.slamtec.com/en/Lidar/A2. [Accessed: 14-Feb-2019]. (spa)
dc.relation: /*ref*/ STEREOLABS, “Stereolabs - Capture the World in 3D,” 2018. [Online]. Available: https://www.stereolabs.com/. [Accessed: 14-Feb-2019]. (spa)
dc.relation: /*ref*/ D. Nair, “A Guide to Stereovision and 3D Imaging,” Tech Briefs. [Online]. Available: https://www.techbriefs.com/component/content/article/tb/features/articles/14925?start=1. [Accessed: 11-Apr-2019]. (spa)
dc.relation: /*ref*/ MaxBotix, “How to Use an Ultrasonic Sensor with Arduino [With Code Examples],” 2017. [Online]. Available: https://www.maxbotix.com/Arduino-Ultrasonic-Sensors-085/. [Accessed: 14-Feb-2019]. (spa)
dc.relation: /*ref*/ E. L. Álvarez-Gutiérrez, “Red ZigBee para la Medición de Variables Físicas con Interfaz en Arduino-MATLAB,” I3+, vol. 3, pp. 50-64, 2016. DOI: https://doi.org/10.24267/23462329.218. (spa)
dc.relation: /*ref*/ A. F. Silva-Bohórquez, L. E. Mendoza, and C. A. Peña-Cortés, “Sistema de inspección y vigilancia utilizando un robot aéreo guiado mediante visión artificial,” ITECKNE, vol. 10, no. 2, pp. 190-198, Feb. 2014. DOI: https://doi.org/10.15332/iteckne.v10i2.421. (spa)
dc.rights: Copyright (c) 2019 ITECKNE (eng)
dc.source: ITECKNE; Vol. 16, No. 2 (2019); 58-70 (eng)
dc.source: ITECKNE; Vol. 16, No. 2 (2019); 58-70 (spa)
dc.source: 2339-3483 (spa)
dc.source: 1692-1798 (spa)
dc.title: Generación de Mapa Global 2D y SLAM usando LiDAR y una Estéreo Cámara para el seguimiento de movimiento de un robot móvil (spa)
dc.title: Global Map Generation and SLAM using LiDAR and Stereo Camera for tracking motion of Mobile Robot (eng)
dc.type: info:eu-repo/semantics/article (spa)
dc.type: info:eu-repo/semantics/publishedVersion (spa)
dc.subject.proposal: LiDAR; Mapa Global; seguimiento de movimiento; SLAM; robot móvil; visión estéreo (spa)
dc.subject.proposal: LiDAR; Global Map; motion tracking; SLAM; mobile robot; stereo vision (eng)


Files in this item


There are no files associated with this item.
