Abstract
Camera calibration helps users interact with their surrounding environment. In this work, we aim to accelerate camera calibration in an indoor setting by selecting a small but sufficient set of keypoints. Our framework consists of two phases. In the offline phase, we cluster photos labeled with Wi-Fi and gyroscope sensor data according to a learned distance metric; the photos in each cluster form a "co-scene". We then select a few frequently appearing keypoints in each co-scene as "useful keypoints" (UKPs). In the online phase, when a query is issued, only UKPs from the nearest co-scene are selected, and we infer the extrinsic camera parameters with multiple-view geometry (MVG) techniques. Experimental results show that our framework supports calibration effectively and efficiently.
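The offline phase described above (cluster sensor-labeled photos into co-scenes, then keep only frequently recurring keypoints as UKPs) can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the toy feature vectors, the greedy radius-based clustering, and the `min_count` threshold are all assumptions standing in for the learned distance metric and real Wi-Fi/gyro data.

```python
from collections import Counter

# Hypothetical toy data: each photo carries a sensor feature vector
# (standing in for Wi-Fi/gyro readings) and the IDs of its detected
# keypoints. Values are illustrative only.
photos = [
    {"feat": (0.0, 0.1), "keypoints": {"a", "b", "c"}},
    {"feat": (0.1, 0.0), "keypoints": {"a", "b", "d"}},
    {"feat": (5.0, 5.1), "keypoints": {"x", "y"}},
    {"feat": (5.1, 5.0), "keypoints": {"x", "z"}},
]

def dist(p, q):
    # Placeholder Euclidean distance; the paper learns this metric.
    return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5

def build_co_scenes(photos, radius=1.0):
    """Greedy clustering: a photo joins the first co-scene whose seed
    photo is within `radius`; otherwise it seeds a new co-scene."""
    co_scenes = []
    for ph in photos:
        for cs in co_scenes:
            if dist(ph["feat"], cs[0]["feat"]) < radius:
                cs.append(ph)
                break
        else:
            co_scenes.append([ph])
    return co_scenes

def useful_keypoints(co_scene, min_count=2):
    """Keep keypoints seen in at least `min_count` photos as UKPs."""
    counts = Counter(kp for ph in co_scene for kp in ph["keypoints"])
    return {kp for kp, n in counts.items() if n >= min_count}

co_scenes = build_co_scenes(photos)
ukps = [useful_keypoints(cs) for cs in co_scenes]
```

At query time, only the UKPs of the nearest co-scene would be matched against the query image and passed to an MVG solver (e.g. a PnP-style routine) to recover the extrinsic parameters; that online step is omitted here.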
Original language | English |
---|---|
Title | UbiComp & ISWC'15 Adjunct: Adjunct Proceedings of the 2015 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2015 ACM International Symposium on Wearable Computers |
Editors | Kenji Mase, Marc Langheinrich, Daniel Gatica-Perez, Hans Gellersen, Tanzeem Choudhury, Koji Yatani |
Number of pages | 4 |
Publisher | Association for Computing Machinery |
Publication date | 2015 |
Pages | 9-12 |
ISBN (Print) | 978-1-4503-3575-1 |
DOI | |
Status | Published - 2015 |
Published externally | Yes |
Event | UbiComp '15: The 2015 ACM International Joint Conference on Pervasive and Ubiquitous Computing - Osaka, Japan. Duration: 7 Sep 2015 → 11 Sep 2015 |
Conference
Conference | UbiComp '15: The 2015 ACM International Joint Conference on Pervasive and Ubiquitous Computing |
---|---|
Country/Territory | Japan |
City | Osaka |
Period | 07/09/2015 → 11/09/2015 |