Paper published in a book (Scientific congresses, symposiums and conference proceedings)
Deep Neural Networks for Identifying Modes of Transport Using GPS Data
HOSSEINI, Seyed Hassan; Gentile, Guido; Miristice, Lory Michelle Bresciani et al.
2024. In: Leonowicz, Zbigniew (Ed.), Proceedings - 24th EEEIC International Conference on Environment and Electrical Engineering and 8th I and CPS Industrial and Commercial Power Systems Europe, EEEIC/I and CPS Europe 2024
Peer reviewed
 

Files


Full Text
Deep Neural Networks for Identifying Modes of Transport Using GPS Data | IEEE Conference Publication.pdf
Author postprint (201.46 kB)
Details



Keywords :
Convolutional neural network; GPS data; Transport mode detection; Data collection method; Mode detection; Mode of transport; Neural networks; Smartphones; Transport modes; Travel modes; Artificial Intelligence; Energy Engineering and Power Technology; Renewable Energy, Sustainability and the Environment; Electrical and Electronic Engineering; Industrial and Manufacturing Engineering; Environmental Engineering; Control and Optimization
Abstract :
[en] In recent years, GPS-based data collection has gradually replaced traditional survey-based methods, as more accurate trip data can be extracted from mobile devices such as smartphones. Inferring travel modes is essential to understanding passengers' travel behavior, and various machine and deep learning techniques have shown their ability to extract valuable details from GPS data. In this study, we introduce a convolutional neural network that extracts high-level features from GPS data to recognize the transport modes of trips. The raw GPS data are first preprocessed to compute motion and displacement characteristics and then sliced into fixed-size segments. A Convolutional Neural Network (CNN) is then trained to detect five transport modes. Our model improved the final accuracy on the test data by 2.84% and outperformed previous work in predicting the transit mode.
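For illustration only, the Python sketch below outlines the kind of pipeline summarized in the abstract: computing motion and displacement features from a raw GPS trace, slicing them into fixed-size segments, and classifying each segment with a small 1-D CNN. The specific feature set, the segment length of 200 points, the layer sizes, and the use of TensorFlow/Keras are assumptions made for this sketch and are not taken from the paper.

# Minimal sketch, assuming a trace of (timestamp in seconds, lat, lon) points;
# all hyperparameters below are illustrative, not the authors' settings.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

def haversine(lat1, lon1, lat2, lon2):
    # Great-circle distance in metres between consecutive GPS fixes.
    r = 6_371_000.0
    p1, p2 = np.radians(lat1), np.radians(lat2)
    dp, dl = p2 - p1, np.radians(lon2 - lon1)
    a = np.sin(dp / 2) ** 2 + np.cos(p1) * np.cos(p2) * np.sin(dl / 2) ** 2
    return 2 * r * np.arcsin(np.sqrt(a))

def motion_features(t, lat, lon):
    # Per-point displacement, speed and acceleration from a raw GPS trace.
    dist = haversine(lat[:-1], lon[:-1], lat[1:], lon[1:])
    dt = np.diff(t).astype(float)
    dt[dt == 0] = 1e-3                      # guard against duplicate timestamps
    speed = dist / dt
    accel = np.diff(speed, prepend=speed[0]) / dt
    return np.stack([dist, speed, accel], axis=-1)   # shape: (n_points - 1, 3)

def slice_segments(features, length=200):
    # Cut the feature sequence into fixed-size, non-overlapping segments.
    n = (len(features) // length) * length
    return features[:n].reshape(-1, length, features.shape[-1])

def build_cnn(length=200, n_features=3, n_modes=5):
    # 1-D CNN classifier over GPS feature segments (five transport modes).
    return models.Sequential([
        layers.Input(shape=(length, n_features)),
        layers.Conv1D(32, 8, activation="relu"),
        layers.MaxPooling1D(2),
        layers.Conv1D(64, 8, activation="relu"),
        layers.GlobalAveragePooling1D(),
        layers.Dense(64, activation="relu"),
        layers.Dense(n_modes, activation="softmax"),
    ])

model = build_cnn()
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

Segments produced from labelled trips would then be passed to model.fit together with integer mode labels (0-4); how the paper actually labels and balances its training data is not reflected in this sketch.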
Disciplines :
Engineering, computing & technology: Multidisciplinary, general & others
Author, co-author :
HOSSEINI, Seyed Hassan; University of Luxembourg; Sapienza University of Rome, Department of Civil, Constructional and Environmental Engineering, Rome, Italy
Gentile, Guido;  Sapienza University of Rome, Department of Civil, Constructional and Environmental Engineering, Rome, Italy
Miristice, Lory Michelle Bresciani;  Sapienza University of Rome, Department of Civil, Constructional and Environmental Engineering, Rome, Italy
VITI, Francesco; University of Luxembourg > Faculty of Science, Technology and Medicine (FSTM) > Department of Engineering (DoE)
Pourkhosro, Siavash;  Sapienza University of Rome, Department of Civil, Constructional and Environmental Engineering, Rome, Italy
External co-authors :
yes
Language :
English
Title :
Deep Neural Networks for Identifying Modes of Transport Using GPS Data
Publication date :
2024
Event name :
2024 IEEE International Conference on Environment and Electrical Engineering and 2024 IEEE Industrial and Commercial Power Systems Europe (EEEIC / I&CPS Europe)
Event place :
Rome, Italy
Event date :
17-06-2024 to 20-06-2024
Audience :
International
Main work title :
Proceedings - 24th EEEIC International Conference on Environment and Electrical Engineering and 8th I and CPS Industrial and Commercial Power Systems Europe, EEEIC/I and CPS Europe 2024
Editor :
Leonowicz, Zbigniew
Publisher :
Institute of Electrical and Electronics Engineers Inc.
ISBN/EAN :
9798350355185
Peer reviewed :
Peer reviewed
Focus Area :
Computational Sciences
Development Goals :
11. Sustainable cities and communities
Available on ORBilu :
since 29 December 2024
