Abstract

Pose estimation is attracting increasing attention in research fields such as constrained guidance and control, robotics, and communication technology. In the extreme environment of space, however, existing spacecraft pose estimation methods remain immature. This paper therefore introduces a spacecraft quadrilateral pose estimation method based on deep learning and keypoint filtering, designed specifically for spacecraft with coplanar features. A two-stage neural network detects the spacecraft's solar panels and extracts their features, generating a heatmap of 2D keypoints. Geometric constraint equations are formulated from the homographic relationship between the solar panels and the image plane, and solving these equations yields the spacecraft's rough pose. The predicted confidence of the 2D keypoints and the rough pose are then used to construct a pixel-error loss function for keypoint filtering, and the refined pose is obtained by minimizing this loss. Extensive experiments on commonly used spacecraft pose estimation data sets demonstrate the effectiveness of the proposed method.
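The rough-pose step described above — recovering the pose of a planar structure (here, the solar panels) from 2D keypoint correspondences via a homography — can be sketched with standard tools. The following is a minimal NumPy illustration under simplifying assumptions, not the authors' implementation: it uses the textbook direct linear transform (DLT) and planar homography decomposition, and assumes known camera intrinsics K and noise-free keypoints. All function names are hypothetical.

```python
import numpy as np

def homography_dlt(obj_xy, img_uv):
    """Estimate the 3x3 homography mapping planar points (X, Y) to pixels (u, v)
    via the direct linear transform; requires at least 4 correspondences."""
    A = []
    for (X, Y), (u, v) in zip(obj_xy, img_uv):
        A.append([-X, -Y, -1, 0, 0, 0, u * X, u * Y, u])
        A.append([0, 0, 0, -X, -Y, -1, v * X, v * Y, v])
    # The homography is the right singular vector of A with smallest singular value.
    _, _, Vt = np.linalg.svd(np.asarray(A))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def pose_from_homography(H, K):
    """Recover the rotation R and translation t of the plane (Z = 0 in the
    object frame) from its homography H and the camera intrinsics K."""
    B = np.linalg.inv(K) @ H          # B is proportional to [r1 | r2 | t]
    lam = 1.0 / np.linalg.norm(B[:, 0])
    r1, r2, t = lam * B[:, 0], lam * B[:, 1], lam * B[:, 2]
    if t[2] < 0:                      # the plane must lie in front of the camera
        r1, r2, t = -r1, -r2, -t
    R = np.column_stack([r1, r2, np.cross(r1, r2)])
    # Project R onto SO(3) to correct numerical drift in the estimated columns.
    U, _, Vt = np.linalg.svd(R)
    R = U @ np.diag([1.0, 1.0, np.linalg.det(U @ Vt)]) @ Vt
    return R, t
```

In practice the detected keypoints are noisy, which is why the paper follows this rough solution with confidence-weighted keypoint filtering and refinement by minimizing a pixel-error loss rather than stopping at the closed-form decomposition.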

Data Availability Statement

Some or all data, models, or code that support the findings of this study are available from the corresponding author upon reasonable request. The data presented in this study are openly available from the following three websites. SPEC2019 data set: https://kelvins.esa.int/satellite-pose-estimation-challenge/data/. SPEC2021 data set: https://kelvins.esa.int/pose-estimation-2021/data/. URSO Soyuz data set: https://github.com/pedropro/UrsoNet?tab=readme-ov-file.

Acknowledgments

This research originates from the Aircraft Vision Perception Team of the School of Aeronautics and Astronautics, Sun Yat-sen University, and the National Innovation Institute of Defense Technology, Academy of Military Science. It was conducted under the guidance of Professor Xiaohu Zhang. We thank the sponsoring institutions and our mentors for their invaluable support and guidance.

Information & Authors

Published In

Journal of Aerospace Engineering
Volume 37, Issue 5, September 2024

History

Received: Jan 26, 2024
Accepted: Mar 22, 2024
Published online: Jul 10, 2024
Published in print: Sep 1, 2024
Discussion open until: Dec 10, 2024

Authors

Affiliations

The School of Aeronautics and Astronautics, Sun Yat-sen Univ., Shenzhen, Guangdong 528406, China. ORCID: https://orcid.org/0009-0007-0558-3275. Email: [email protected]
Researcher, National Innovation Institute of Defense Technology, Academy of Military Science, No. 53 Fengtai East St., Beijing 100071, China. ORCID: https://orcid.org/0000-0001-5592-0558. Email: [email protected]
Jie Wang, Ph.D., S.M.ASCE [email protected]
The School of Aeronautics and Astronautics, Sun Yat-sen Univ., Shenzhen, Guangdong 528406, China. Email: [email protected]
Xiangpeng Xu, Ph.D., S.M.ASCE [email protected]
The School of Aeronautics and Astronautics, Sun Yat-sen Univ., Shenzhen, Guangdong 528406, China. Email: [email protected]
Ling Meng, Aff.M.ASCE [email protected]
Researcher, National Innovation Institute of Defense Technology, Academy of Military Science, No. 53 Fengtai East St., Beijing 100071, China. Email: [email protected]
Xiaohu Zhang, Aff.M.ASCE [email protected]
Professor, The School of Aeronautics and Astronautics, Sun Yat-sen Univ., Shenzhen, Guangdong 528406, China (corresponding author). Email: [email protected]
