Realtime Tracking of Passengers on the London Underground Transport by Matching Smartphone Accelerometer Footprints. / Nguyen, Khuong An; Wang, You; Li, Guang; Luo, Zhiyuan; Watkins, Chris.

In: Sensors (Basel, Switzerland), Vol. 19, No. 19, 4184, 26.09.2019, p. 1-26.

Research output: Contribution to journal › Article

Published

Abstract

Passengers travelling on the London Underground currently have no means of knowing their whereabouts between stations. The challenge in providing such a service is that the London Underground tunnels have no GPS, Wi-Fi, Bluetooth, or terrestrial signals of any kind to leverage. This paper presents a novel yet practical idea for tracking passengers in real time using the smartphone accelerometer and a training database of the entire London Underground network. Our rationale is that London tubes are self-driving trains with predictable accelerations, decelerations, and travelling times, and that they always travel on the same fixed rail lines between stations, with distinctive bumps and vibrations, which permits us to generate an accelerometer map of the tubes’ movements on each line. Given a passenger’s accelerometer data, we identify in real time which line they are travelling on and which station they departed from, using a pattern-matching algorithm, with an accuracy of up to about 90% when the sampling length covers at least 3 station stops. We incorporate Principal Component Analysis to perform inertial tracking of passengers’ positions along the line when trains break away from scheduled movements during rush hours. Our proposal was painstakingly assessed on the entire London Underground, covering approximately 940 km of travelling distance and spanning 381 stations on 11 different lines.
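The abstract describes matching a passenger's accelerometer trace against a pre-recorded accelerometer map of each line. As a minimal illustrative sketch only (not the paper's actual algorithm; the z-normalisation, sliding Euclidean distance, and the function and line names here are all assumptions), such template matching might look like:

```python
import numpy as np

def znorm(x):
    """Z-normalise a trace so matching is invariant to offset and scale."""
    x = np.asarray(x, dtype=float)
    return (x - x.mean()) / (x.std() + 1e-12)

def match_line(query, line_templates):
    """Slide the z-normalised query over each line's reference
    accelerometer trace and return (best_line, best_offset, best_dist)
    for the closest segment by Euclidean distance."""
    q = znorm(query)
    best = (None, -1, np.inf)
    for line, template in line_templates.items():
        t = np.asarray(template, dtype=float)
        for off in range(len(t) - len(q) + 1):
            d = np.linalg.norm(q - znorm(t[off:off + len(q)]))
            if d < best[2]:
                best = (line, off, d)
    return best

# Hypothetical example: two synthetic line "maps" (chirp vs. square wave)
# and a noisy query cut from one of them.
rng = np.random.default_rng(0)
victoria = np.sin(0.0005 * np.arange(300) ** 2)          # non-repeating chirp
central = np.sign(np.sin(np.linspace(0, 12 * np.pi, 300)))
query = victoria[50:120] + rng.normal(0.0, 0.05, 70)
line, offset, dist = match_line(query, {"Victoria": victoria, "Central": central})
```

A real system would match longer multi-station windows (hence the ~3-stop sampling length noted above) and would need to handle varying train speeds, e.g. via dynamic time warping rather than a rigid slide.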
Original language: English
Article number: 4184
Pages (from-to): 1-26
Number of pages: 26
Journal: Sensors (Basel, Switzerland)
Volume: 19
Issue number: 19
DOIs
Publication status: Published - 26 Sep 2019
This open access research output is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported License.

ID: 34814256