Development of a Soft-Actor Critic Reinforcement Learning Algorithm for the Energy Management of a Hybrid Electric Vehicle / Rolando, Luciano; Campanelli, Nicola; Tresca, Luigi; Pulvirenti, Luca; Millo, Federico. - In: SAE TECHNICAL PAPER. - ISSN 0148-7191. - 1:(2024). (Paper presented at the CO2 Reduction for Transportation Systems Conference, held in Turin, Italy, June 2024) [10.4271/2024-37-0011].
Development of a Soft-Actor Critic Reinforcement Learning Algorithm for the Energy Management of a Hybrid Electric Vehicle
Rolando, Luciano; Campanelli, Nicola; Tresca, Luigi; Pulvirenti, Luca; Millo, Federico
2024
Abstract
In recent years, the urgent need to fully exploit the fuel economy potential of Electrified Vehicles (xEVs) through the optimal design of their Energy Management System (EMS) has led to an increasing interest in Machine Learning (ML) techniques. Among them, Reinforcement Learning (RL) appears to be one of the most promising approaches thanks to its distinctive structure, in which an agent learns the optimal control strategy by interacting directly with an environment, making decisions, and receiving feedback in the form of rewards. Therefore, in this study, a new Soft Actor-Critic (SAC) agent, which exploits a stochastic policy, was implemented on a digital twin of a state-of-the-art diesel Plug-in Hybrid Electric Vehicle (PHEV) available on the European market. The SAC agent was trained to enhance the fuel economy of the PHEV while guaranteeing its battery charge sustainability. The potential of the proposed control strategy was first assessed on the Worldwide harmonized Light-duty vehicles Test Cycle (WLTC) and benchmarked against a Dynamic Programming (DP) optimization to evaluate the performance of two different reward functions. Then, the best-performing agent was tested on two additional driving cycles taken from the Environmental Protection Agency (EPA) regulatory framework: the Federal Test Procedure-75 (FTP75) and the Highway Fuel Economy Test (HFET), representative of urban and highway driving scenarios, respectively. The best-performing SAC model achieved results close to the DP reference on the WLTC, with a limited fuel consumption gap (below 9%) over all the testing cycles.
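The abstract describes a reward that trades off fuel consumption against battery charge sustainability. The toy sketch below is purely illustrative and is not the authors' implementation: the environment model, coefficients, and function names are all assumptions, chosen only to show the shape of such a charge-sustaining reward.

```python
def ems_reward(fuel_rate_gps, soc, soc_target=0.5, alpha=1.0, beta=10.0):
    """Illustrative reward: penalize fuel use and SOC deviation.

    alpha and beta are hypothetical weights; the paper's actual reward
    formulations are not reproduced here.
    """
    return -alpha * fuel_rate_gps - beta * abs(soc - soc_target)

def toy_step(soc, power_split, demand_kw, dt_s=1.0, capacity_kwh=10.0):
    """Advance a crude battery/engine model by one time step.

    power_split in [0, 1] is the fraction of the power demand met by
    the engine; the remainder is drawn from the battery.
    """
    engine_kw = power_split * demand_kw
    battery_kw = demand_kw - engine_kw
    fuel_rate_gps = 0.06 * engine_kw  # crude linear fuel-rate assumption
    soc -= battery_kw * dt_s / 3600.0 / capacity_kwh
    return min(max(soc, 0.0), 1.0), fuel_rate_gps

# One-step example: meet a 20 kW demand half from the engine.
soc, fuel = toy_step(soc=0.5, power_split=0.5, demand_kw=20.0)
r = ems_reward(fuel, soc)
```

An RL agent such as SAC would, over many such steps, learn a policy mapping vehicle states to power-split actions that maximizes the cumulative reward, which is the sense in which the trained agent balances fuel economy against charge sustainability.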
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.
https://hdl.handle.net/11583/2992425
Warning: the displayed data have not been validated by the university.