DOI:

https://doi.org/10.14483/23448393.16562

Published:

2021-05-30

Issue:

Vol. 26 No. 2 (2021): May-August

Section:

Electrical and Electronic Engineering

Vision-based Software Tool System for Position Estimation Using a Smartphone

Sistema de herramientas software basado en visión para la estimación de posición usando un teléfono inteligente

Authors

Keywords:

Smartphone, Position estimation, Computer vision (en).

Keywords:

Teléfono inteligente, estimación de posición, visión por computador (es).


Abstract (en)

Context: Current smartphone models include a rich set of sensors, such as cameras, IMUs, GPS receivers, and environmental sensors. This combination of sensors motivates the use of smartphones in scientific and service applications. One of these applications is precision agriculture, specifically drone position estimation using computer vision in GPS-denied environments for remote crop measurements.

Method: This work presents the development of EVP, a vision-based position estimation system that uses a modern smartphone and computer vision methods. EVP consists of two software applications: an Android app (mobile station) running on a smartphone, capable of controlling the drone’s flight, acquiring images, and transmitting them over a wireless network; and a second application (base station) running on a Linux-based computer, capable of receiving the images, processing them, and running the position estimation algorithms. In this work, the mobile station is mounted on a quadcopter. Using EVP, users can configure the mobile and base station software, execute the vision-based position estimation method, observe position plots on the base station, and store sensor data in a database.
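The abstract does not specify EVP’s communication protocol or its estimation algorithm, so the following is a minimal sketch of the base-station side under stated assumptions: frames arrive as length-prefixed JPEGs over TCP (the framing, the recv_frame helper, and PORT are hypothetical), and relative camera motion is estimated with a generic monocular ORB-feature plus essential-matrix step in OpenCV, used here as a stand-in for the paper’s vision-based method.

# Minimal base-station sketch (assumed protocol and pipeline, not the exact EVP one):
# receive length-prefixed JPEG frames over TCP and estimate relative camera motion.
import socket
import struct

import cv2
import numpy as np

PORT = 5000  # hypothetical port; the real EVP configuration is not given in the abstract


def recv_frame(conn):
    """Read one length-prefixed JPEG frame and decode it to a grayscale image."""
    (length,) = struct.unpack(">I", conn.recv(4))
    buf = b""
    while len(buf) < length:
        buf += conn.recv(length - len(buf))
    return cv2.imdecode(np.frombuffer(buf, np.uint8), cv2.IMREAD_GRAYSCALE)


def relative_pose(prev_img, curr_img, K):
    """Estimate rotation R and unit-scale translation t between two consecutive frames."""
    orb = cv2.ORB_create(2000)
    kp1, des1 = orb.detectAndCompute(prev_img, None)
    kp2, des2 = orb.detectAndCompute(curr_img, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)
    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])
    E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
    return R, t

In such a pipeline, the camera matrix K would come from an offline calibration of the smartphone camera.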

Results: EVP was evaluated in three field tests: an indoor environment, an open-field flight, and a flight over the Engineering Department’s square at Universidad del Valle. The root mean square errors obtained in the XY plane, computed against GPS-RTK measurements, were 0.166 m, 2.8 m, and 1.4 m, respectively.
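As an illustration of how the reported errors can be computed, the short sketch below evaluates the root mean square error in the XY plane of estimated positions against GPS-RTK ground truth; the xy_rmse helper and the sample arrays are illustrative and are not the paper’s experimental data.

import numpy as np

def xy_rmse(estimated_xy, rtk_xy):
    """XY RMSE in metres; both inputs are (N, 2) arrays of X, Y positions."""
    err = np.asarray(estimated_xy) - np.asarray(rtk_xy)
    return float(np.sqrt(np.mean(np.sum(err ** 2, axis=1))))

# Synthetic example (not the experimental data reported above):
est = np.array([[0.0, 0.0], [1.0, 1.1], [2.0, 2.2]])
rtk = np.array([[0.0, 0.1], [1.0, 1.0], [2.1, 2.0]])
print(f"XY RMSE = {xy_rmse(est, rtk):.3f} m")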

Conclusions: As a result, a vision-based position estimation system called EVP was developed and tested in real-world experiments. This system can be used in GPS-denied environments to perform tasks such as 3D mapping, pick-up and delivery of goods, object tracking, among others.

Acknowledgements: This work was partially funded by the research project “Autonomous Aerial System to Map the Nitrogen Contents in Crops using Micro-Spectral Sensors”, contract CI2961 of Universidad del Valle.

