Abstract
Camera calibration helps users interact better with their surrounding environment. In this work, we aim to accelerate camera calibration in indoor settings by selecting a small but sufficient set of keypoints. Our framework consists of two phases. In the offline phase, we cluster photos labeled with Wi-Fi and gyroscope sensor data according to a learned distance metric; the photos in each cluster form a "co-scene". We then select a few frequently appearing keypoints in each co-scene as "useful keypoints" (UKPs). In the online phase, when a query is issued, only the UKPs from the nearest co-scene are selected, and the extrinsic camera parameters are subsequently inferred with multi-view geometry (MVG) techniques. Experimental results show that our framework supports calibration both effectively and efficiently.
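The two-phase pipeline described in the abstract can be sketched roughly as follows. This is a minimal illustrative sketch, not the paper's implementation: plain k-means over sensor features stands in for the learned distance metric, keypoints are represented as string IDs, and all function names and parameters are assumptions.

```python
# Hypothetical sketch of the offline/online UKP pipeline; names,
# features, and thresholds are illustrative, not from the paper.
from collections import Counter
import numpy as np

def offline_phase(sensor_feats, photo_keypoints, n_clusters=2, top_k=3, seed=0):
    """Cluster photos into co-scenes by their Wi-Fi/gyro feature vectors
    (simple k-means stands in for the paper's learned distance metric),
    then keep the most frequent keypoints per co-scene as UKPs."""
    rng = np.random.default_rng(seed)
    X = np.asarray(sensor_feats, dtype=float)
    centers = X[rng.choice(len(X), n_clusters, replace=False)]
    for _ in range(20):  # plain Lloyd iterations
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        for c in range(n_clusters):
            if (labels == c).any():
                centers[c] = X[labels == c].mean(axis=0)
    ukps = {}
    for c in range(n_clusters):
        counts = Counter(kp for i in np.flatnonzero(labels == c)
                         for kp in photo_keypoints[i])
        ukps[c] = [kp for kp, _ in counts.most_common(top_k)]
    return centers, ukps

def online_phase(query_feat, centers, ukps):
    """Pick the nearest co-scene for a query's sensor features and return
    only its UKPs, which would then feed MVG pose estimation."""
    c = int(np.argmin(((centers - np.asarray(query_feat)) ** 2).sum(-1)))
    return c, ukps[c]
```

Restricting matching to the UKPs of one co-scene is what gives the speed-up: the query is compared against a handful of frequently observed keypoints rather than every keypoint in the map.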
| Original language | English |
|---|---|
| Title of host publication | UbiComp & ISWC'15 Adjunct: Adjunct Proceedings of the 2015 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2015 ACM International Symposium on Wearable Computers |
| Editors | Kenji Mase, Marc Langheinrich, Daniel Gatica-Perez, Hans Gellersen, Tanzeem Choudhury, Koji Yatani |
| Number of pages | 4 |
| Publisher | Association for Computing Machinery |
| Publication date | 2015 |
| Pages | 9-12 |
| ISBN (Print) | 978-1-4503-3575-1 |
| Publication status | Published - 2015 |
| Externally published | Yes |
| Event | UbiComp '15: The 2015 ACM International Joint Conference on Pervasive and Ubiquitous Computing - Osaka, Japan Duration: 7 Sept 2015 → 11 Sept 2015 |
Conference
| Conference | UbiComp '15: The 2015 ACM International Joint Conference on Pervasive and Ubiquitous Computing |
|---|---|
| Country/Territory | Japan |
| City | Osaka |
| Period | 07/09/2015 → 11/09/2015 |