Show simple item record

dc.contributor.author: Hosseini, Ehsan
dc.contributor.author: Horrillo Quintero, Pablo
dc.contributor.author: Carrasco González, David
dc.contributor.author: García Triviño, Pablo
dc.contributor.author: Sarrias Mena, Raúl
dc.contributor.author: García Vázquez, Carlos Andrés
dc.contributor.author: Fernández Ramírez, Luis Miguel
dc.contributor.other: Ingeniería Eléctrica
dc.contributor.other: Ingeniería en Automática, Electrónica, Arquitectura y Redes de Computadores
dc.date.accessioned: 2025-09-11T09:54:19Z
dc.date.available: 2025-09-11T09:54:19Z
dc.date.issued: 2025
dc.identifier.issn: 2352-1538
dc.identifier.issn: 2352-152X
dc.identifier.uri: http://hdl.handle.net/10498/37166
dc.description.abstract: In this study, a reinforcement learning (RL) algorithm is used within the energy management system (EMS) for battery energy storage systems (BESs) in a multilevel microgrid. The microgrid integrates photovoltaic (PV) plants and wind turbines (WT) through a multilevel configuration based on battery energy-stored quasi-Z-source cascaded H-bridge multilevel inverters (BES-qZS-CHBMLIs). A twin-delayed deep deterministic policy gradient (TD3) agent is implemented as the RL agent to dispatch power among the BESs so as to meet the requested grid power while accounting for BES efficiency and lifetime. Two 4.8 kW PV plants and a 5 kW WT, each integrating a BES with a different rated capacity, are connected to the grid through a BES-qZS-CHBMLI configuration, and the resulting microgrid is simulated in MATLAB to evaluate the performance of the proposed RL-EMS. In addition, a state-of-charge-based EMS (SOC-EMS), a fuzzy logic EMS (FL-EMS), and two nonlinear optimization-based EMSs (PSO and fmincon) are implemented for comparison. The comparison demonstrates the superior performance of the RL-based EMS, with improvements of up to 18.09% in the integral time absolute error (ITAE) for the active power, 17.77% in the ITAE for the reactive power, and 21.38% in the standard deviation (STD) of the active power over the EMSs based on SOC, fuzzy logic, fmincon, and PSO. Additionally, the fmincon-EMS shows a notable improvement over the remaining methods, achieving up to 15.12% better performance in power demand tracking and BES dispatch. In a dynamic environment with fluctuating power production and demand, the trained RL system effectively optimizes the power injected into or stored in the BESs while maintaining the grid demand and battery SOC balance.
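The dispatch problem the abstract describes, meeting a requested grid power by splitting it among several BESs while keeping their states of charge balanced, can be illustrated with a toy sketch. This is a hand-written proportional policy for intuition only, not the paper's trained TD3 agent; all function names and numbers are hypothetical:

```python
# Toy battery-dispatch step (illustrative only; not the RL-EMS from the paper).
# Positive power = discharge toward the grid.

def dispatch(p_request, socs, capacities, dt_h=1.0):
    """Split a requested grid power (kW) between batteries in proportion
    to their stored energy, then update each state of charge (SOC).

    socs        -- SOC per BES, in [0, 1]
    capacities  -- rated capacity per BES, in kWh
    dt_h        -- time step, in hours
    """
    energies = [s * c for s, c in zip(socs, capacities)]   # stored kWh per BES
    total = sum(energies)
    powers = [p_request * e / total for e in energies]     # kW drawn per BES
    new_socs = [s - p * dt_h / c
                for s, p, c in zip(socs, powers, capacities)]
    return powers, new_socs

# Example: 3 kW requested from two BESs (10 kWh at 80% SOC, 5 kWh at 40% SOC).
powers, socs = dispatch(3.0, [0.8, 0.4], [10.0, 5.0])
```

Drawing proportionally to stored energy pushes the SOCs toward each other over time, which is the "SOC balance" objective the abstract mentions; the paper's TD3 agent additionally weighs BES efficiency and lifetime when choosing the split.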
dc.format: application/pdf
dc.language.iso: eng
dc.publisher: Elsevier Ltd
dc.rights: Attribution-NonCommercial-NoDerivatives 4.0 International
dc.rights.uri: http://creativecommons.org/licenses/by-nc-nd/4.0/
dc.source: Journal of Energy Storage - 2025, vol. 109
dc.subject: Microgrid, lithium-ion battery storage
dc.subject: Renewable energy
dc.subject: Quasi-Z-source inverter
dc.subject: Reinforcement learning
dc.subject: Energy management system
dc.title: Reinforcement Learning-Based Energy Management System for Lithium-Ion Battery Storage in Multilevel Microgrid
dc.type: journal article
dc.rights.accessRights: embargoed access
dc.identifier.doi: 10.1016/J.EST.2024.115114
dc.relation.projectID: info:eu-repo/grantAgreement/MCIN/AEI/FEDER/ PID2021-123633OB-C32
dc.type.hasVersion: AM


Files in this item

This item appears in the following collection(s)


Attribution-NonCommercial-NoDerivatives 4.0 International
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License