E²C²: efficient and effective camera calibration in indoor environments

Huan Li, Pai Peng, Hua Lu, Lidan Shou, Ke Chen, Gang Chen

Publication: Contribution to book/anthology/report › Conference contribution in proceedings › Research › Peer-reviewed

Abstract

Camera calibration helps users better interact with their surrounding environments. In this work, we aim to accelerate camera calibration in indoor settings by selecting a small but sufficient set of keypoints. Our framework consists of two phases. In the offline phase, we cluster photos labeled with Wi-Fi and gyroscope sensor data according to a learned distance metric; the photos in each cluster form a "co-scene". We further select a few frequently appearing keypoints in each co-scene as "useful keypoints" (UKPs). In the online phase, when a query is issued, only UKPs from the nearest co-scene are retrieved, and we then infer the extrinsic camera parameters with a multiple view geometry (MVG) technique. Experimental results show that our framework supports calibration both effectively and efficiently.
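The abstract only outlines the online step. As a rough illustration, the Python sketch below shows how extrinsic parameters could be recovered from UKPs of the nearest co-scene that have been matched to a query photo, using a standard MVG solver (OpenCV's solvePnP). The function name estimate_extrinsics, the known-intrinsics assumption, and the zero-distortion default are illustrative assumptions, not the authors' implementation.

import numpy as np
import cv2  # OpenCV; solvePnP performs the MVG pose-estimation step


def estimate_extrinsics(ukp_3d, ukp_2d, camera_matrix, dist_coeffs=None):
    """Recover extrinsic parameters (rotation R, translation t) of the query
    camera from matched useful keypoints (UKPs) of the nearest co-scene.

    ukp_3d: (N, 3) world coordinates of the UKPs in the co-scene model
    ukp_2d: (N, 2) pixel coordinates of the same UKPs in the query photo
    camera_matrix: (3, 3) intrinsic matrix, assumed known
    """
    if dist_coeffs is None:
        dist_coeffs = np.zeros(4)  # assume negligible lens distortion
    ok, rvec, tvec = cv2.solvePnP(
        ukp_3d.astype(np.float64),
        ukp_2d.astype(np.float64),
        camera_matrix,
        dist_coeffs,
        flags=cv2.SOLVEPNP_ITERATIVE,  # needs at least 4 correspondences
    )
    if not ok:
        raise RuntimeError("pose estimation failed")
    R, _ = cv2.Rodrigues(rvec)  # rotation vector -> 3x3 rotation matrix
    return R, tvec

Because only the few UKPs of one co-scene are matched, the correspondence and solving steps operate on far fewer points than a full keypoint set would require, which is the source of the claimed speed-up.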
Original language: English
Title: UbiComp & ISWC'15 Adjunct: Adjunct Proceedings of the 2015 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2015 ACM International Symposium on Wearable Computers
Editors: Kenji Mase, Marc Langheinrich, Daniel Gatica-Perez, Hans Gellersen, Tanzeem Choudhury, Koji Yatani
Number of pages: 4
Publisher: Association for Computing Machinery
Publication date: 2015
Pages: 9-12
ISBN (Print): 978-1-4503-3575-1
DOI
Status: Published - 2015
Published externally: Yes
Event: UbiComp '15: The 2015 ACM International Joint Conference on Pervasive and Ubiquitous Computing - Osaka, Japan
Duration: 7 Sep 2015 - 11 Sep 2015

Conference

Conference: UbiComp '15: The 2015 ACM International Joint Conference on Pervasive and Ubiquitous Computing
Country/Territory: Japan
City: Osaka
Period: 07/09/2015 - 11/09/2015
