Conference paper. Year: 2014

Fully automatic extrinsic calibration of RGB-D system using two views of natural scene

Abstract

RGB-D sensors, such as the low-cost Kinect, are widely used in robotics applications. Obstacle Avoidance (OA), Simultaneous Localization And Mapping (SLAM), and Mobile Object Tracking (MOT) all need accurate information about the position of objects in the environment. 3D cameras are very convenient for these tasks, but as low-cost sensors they have to be complemented by other sensors: cameras, laser range finders, ultrasonic or infrared telemeters. In order to exploit all sensor data in the same algorithm, these data must be expressed in a common reference frame; in other words, the rigid transformation between the sensor frames must be known. In this paper, we propose a new method to retrieve the rigid transformation (known as the extrinsic parameters in the calibration process) between a depth camera and a conventional camera. We show that such a method is accurate enough without requiring user interaction or a special calibration pattern, unlike other common calibration processes.
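As an illustration of what the estimated extrinsic parameters are used for, the sketch below applies a rigid transformation (rotation R, translation t) to express 3D points measured in the depth-camera frame in the conventional (RGB) camera frame. The function name and the numeric values of R and t are placeholders assumed for this example, not results from the paper.

```python
import numpy as np

# Placeholder extrinsic parameters (rotation R, translation t). In practice
# these would come from an extrinsic calibration such as the one described
# in the paper; the values below are assumptions used only for illustration.
R = np.eye(3)                      # 3x3 rotation matrix (depth -> RGB frame)
t = np.array([0.025, 0.0, 0.0])    # translation in metres (depth -> RGB frame)

def depth_to_rgb_frame(points_depth, R, t):
    """Express 3D points given in the depth-camera frame in the RGB-camera frame.

    points_depth: (N, 3) array of 3D points in the depth-camera frame.
    Returns an (N, 3) array of the same points in the RGB-camera frame.
    """
    return points_depth @ R.T + t

# Usage example: a point one metre in front of the depth camera.
p_depth = np.array([[0.0, 0.0, 1.0]])
print(depth_to_rgb_frame(p_depth, R, t))   # -> [[0.025 0.    1.   ]]
```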
No file deposited

Dates and versions

hal-01145890, version 1 (27-04-2015)

Identifiers

HAL Id: hal-01145890
DOI: 10.1109/ICARCV.2014.7064423

Cite

Jean-Clement Devaux, Hicham Hadj-Abdelkader, Etienne Colle. Fully automatic extrinsic calibration of RGB-D system using two views of natural scene. 13th International Conference on Control Automation Robotics and Vision (ICARCV 2014), Dec 2014, Singapore, Singapore. pp. 894-900, ⟨10.1109/ICARCV.2014.7064423⟩. ⟨hal-01145890⟩