E^2C^2: efficient and effective camera calibration in indoor environments

Huan Li, Pai Peng, Hua Lu, Lidan Shou, Ke Chen, Gang Chen

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review

Abstract

Camera calibration helps users interact better with their surrounding environment. In this work, we aim to accelerate camera calibration in indoor settings by selecting a small but sufficient set of keypoints. Our framework consists of two phases. In the offline phase, we cluster photos labeled with Wi-Fi and gyroscope sensor data according to a learned distance metric; the photos in each cluster form a "co-scene". We then select a few frequently appearing keypoints in each co-scene as "useful keypoints" (UKPs). In the online phase, when a query is issued, only the UKPs from the nearest co-scene are selected, and the extrinsic camera parameters are subsequently inferred with multiple-view geometry (MVG) techniques. Experimental results show that our framework supports calibration both effectively and efficiently.
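The paper itself gives no implementation details here, but the offline UKP-selection step the abstract describes (keep only keypoints that recur across the photos of a co-scene) can be sketched as a simple frequency filter. The function name `select_ukps` and the `min_support` threshold are illustrative assumptions, not part of the published method:

```python
from collections import Counter

def select_ukps(co_scene_photos, min_support=0.5):
    """Hypothetical sketch: pick 'useful keypoints' (UKPs) as the keypoint
    IDs that appear in at least `min_support` of a co-scene's photos.

    co_scene_photos: list of sets of keypoint IDs, one set per photo.
    """
    # Count in how many photos each keypoint appears (set() deduplicates
    # repeated detections within a single photo).
    counts = Counter(kp for photo in co_scene_photos for kp in set(photo))
    threshold = min_support * len(co_scene_photos)
    return {kp for kp, c in counts.items() if c >= threshold}

# Toy example: three photos in one co-scene, keypoints as integer IDs.
co_scene = [{1, 2, 3}, {2, 3, 4}, {3, 5}]
ukps = select_ukps(co_scene, min_support=0.6)  # → {2, 3}
```

Only keypoints 2 and 3 clear the 60% support threshold, so the online phase would match a query photo against this much smaller set before running MVG.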
Original language: English
Title of host publication: UbiComp & ISWC'15 Adjunct: Adjunct Proceedings of the 2015 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2015 ACM International Symposium on Wearable Computers
Editors: Kenji Mase, Marc Langheinrich, Daniel Gatica-Perez, Hans Gellersen, Tanzeem Choudhury, Koji Yatani
Number of pages: 4
Publisher: Association for Computing Machinery
Publication date: 2015
Pages: 9-12
ISBN (Print): 978-1-4503-3575-1
DOIs
Publication status: Published - 2015
Externally published: Yes
Event: UbiComp '15: The 2015 ACM International Joint Conference on Pervasive and Ubiquitous Computing - Osaka, Japan
Duration: 7 Sept 2015 – 11 Sept 2015

Conference

Conference: UbiComp '15: The 2015 ACM International Joint Conference on Pervasive and Ubiquitous Computing
Country/Territory: Japan
City: Osaka
Period: 07/09/2015 – 11/09/2015
