Detecting industrial vehicles’ duty levels using contrastive learning / Cagliero, Luca; Buccafusco, Silvia; Vaccarino, Francesco; Salvatori, Lucia; Loti, Riccardo. - Print. - (2023), pp. 1630-1636. (Paper presented at the 2023 IEEE International Conference on Big Data (IEEE Big Data 2023), held in Sorrento (Italy) on December 15-18, 2023) [10.1109/BigData59044.2023.10386964].
Detecting industrial vehicles’ duty levels using contrastive learning
Luca Cagliero; Silvia Buccafusco; Francesco Vaccarino; Lucia Salvatori; Riccardo Loti
2023
Abstract
Industrial vehicles equipped with CAN bus devices transmit large volumes of IoT signals. The analysis of CAN bus data can help monitor the current vehicles’ workload, namely the vehicle duty levels, in an automated fashion. Although the use of machine learning techniques to automatically detect vehicle duty levels is particularly appealing, existing approaches are challenged by the high cost of human data annotation and by the high heterogeneity of the analyzed vehicle types and models. In this paper, we present a self-supervised approach, based on contrastive learning, to automatically detect vehicles’ duty levels. The multivariate CAN bus signals are first divided into fixed-size segments and then embedded into a vector space shared by all vehicles of the same model by leveraging a contrastive approach with a mixup augmentation strategy. The key idea is to embed similar segments in close proximity by self-learning a model-specific clustering, which allows automatic duty level assignment with minimal human supervision. We validate the proposed approach in a real industrial use case, analyzing CAN bus data acquired from test heavy-duty vehicles. Data were provided by a multinational Internet-of-Things company specialized in telematics solutions. The experiments show clustering performance superior to that of state-of-the-art models, as well as a higher ability to differentiate between the Moving and Working duty levels.
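The abstract outlines a pipeline (fixed-size segmentation of multivariate CAN bus signals, mixup augmentation, contrastive embedding, and clustering for duty level assignment). Below is a minimal sketch of such a pipeline, not the authors' implementation: the window length, the small CNN encoder, the NT-Xent-style loss, and the number of clusters are all illustrative assumptions, using PyTorch and scikit-learn.

```python
# Minimal sketch of a segmentation + mixup + contrastive + clustering pipeline.
# NOT the paper's exact architecture or hyperparameters; all sizes are assumed.
import numpy as np
import torch
import torch.nn as nn
import torch.nn.functional as F
from sklearn.cluster import KMeans

WINDOW = 60        # assumed fixed segment length (timesteps)
N_SIGNALS = 8      # assumed number of CAN bus channels
EMB_DIM = 32       # assumed embedding size
N_DUTY_LEVELS = 3  # e.g. Idle / Moving / Working

def segment(signals: np.ndarray, window: int = WINDOW) -> np.ndarray:
    """Split a (time, channels) multivariate series into fixed-size segments."""
    n = (len(signals) // window) * window
    return signals[:n].reshape(-1, window, signals.shape[1])

def mixup(batch: torch.Tensor, alpha: float = 0.2) -> torch.Tensor:
    """Mixup augmentation: convex combination of each segment with a shuffled one."""
    lam = float(np.random.beta(alpha, alpha))
    perm = torch.randperm(batch.size(0))
    return lam * batch + (1 - lam) * batch[perm]

class Encoder(nn.Module):
    """Small 1D-CNN encoder mapping a segment to a normalized embedding."""
    def __init__(self, n_signals: int = N_SIGNALS, emb_dim: int = EMB_DIM):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(n_signals, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(), nn.Linear(64, emb_dim),
        )

    def forward(self, x):                 # x: (batch, window, channels)
        z = self.net(x.transpose(1, 2))   # Conv1d expects (batch, channels, time)
        return F.normalize(z, dim=1)

def nt_xent(z1: torch.Tensor, z2: torch.Tensor, tau: float = 0.1) -> torch.Tensor:
    """NT-Xent contrastive loss: each segment's positive is its mixup view."""
    n = z1.size(0)
    z = torch.cat([z1, z2], dim=0)        # (2n, emb_dim), rows are normalized
    sim = (z @ z.t()) / tau               # cosine similarity logits
    sim = sim.masked_fill(torch.eye(2 * n, dtype=torch.bool), float("-inf"))
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)])
    return F.cross_entropy(sim, targets)

def train_and_cluster(signals: np.ndarray, epochs: int = 10) -> np.ndarray:
    segs = torch.tensor(segment(signals), dtype=torch.float32)
    enc = Encoder(n_signals=signals.shape[1])
    opt = torch.optim.Adam(enc.parameters(), lr=1e-3)
    for _ in range(epochs):
        opt.zero_grad()
        loss = nt_xent(enc(segs), enc(mixup(segs)))
        loss.backward()
        opt.step()
    with torch.no_grad():
        emb = enc(segs).numpy()
    # Cluster the shared embedding space; in the paper's setting, clusters are
    # then mapped to duty levels using minimal human supervision.
    return KMeans(n_clusters=N_DUTY_LEVELS, n_init=10).fit_predict(emb)

if __name__ == "__main__":
    fake_can = np.random.randn(6000, N_SIGNALS)   # stand-in for real CAN data
    print(train_and_cluster(fake_can)[:20])
```

In this sketch the mixup view serves as the positive pair for each segment; with a handful of labeled segments per vehicle model, each cluster can then be assigned a duty level, which is one plausible reading of the "minimal human supervision" step described above.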
| File | Access | Type | License | Size | Format |
|---|---|---|---|---|---|
| IEEE_Big_Data_2023_Contrastive_MICENE (1).pdf | Open access | 2. Post-print / Author's Accepted Manuscript | Public - All rights reserved | 2.1 MB | Adobe PDF |
| Detecting_industrial_vehicles_duty_levels_using_contrastive_learning.pdf | Restricted access | 2a. Post-print editorial version / Version of Record | Non-public - Private/restricted access | 2.16 MB | Adobe PDF |
https://hdl.handle.net/11583/2984774