A simulation environment for training a reinforcement learning agent trading a battery storage

Access rights

openAccess
publishedVersion

A1 Original article in a scientific journal

Date

2021-09-06

Language

en

Pages

20

Series

Energies, Volume 14, Issue 17

Abstract

Battery storages are an essential element of the emerging smart grid. Compared to other distributed intelligent energy resources, batteries have the advantage of being able to react rapidly to events such as renewable generation fluctuations or grid disturbances. There is a lack of research on ways to profitably exploit this ability. Any solution needs to consider rapid electrical phenomena as well as the much slower dynamics of the relevant electricity markets. Reinforcement learning is a branch of artificial intelligence that has shown promise in optimizing complex problems involving uncertainty. This article applies reinforcement learning to the problem of trading batteries. The problem involves two timescales, both of which are important for profitability. Firstly, trading the battery capacity must occur on the timescale of the chosen electricity markets. Secondly, the real-time operation of the battery must ensure that no financial penalties are incurred for failing to meet the technical specification. These trading-related decisions must be made under uncertainty, such as unknown future market prices and unpredictable power grid disturbances. In this article, a simulation model of a battery system is proposed as the environment in which a reinforcement learning agent is trained to make such decisions. The system is demonstrated with an application of the battery to the Finnish primary frequency reserve markets.
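
The two-timescale setting described above can be illustrated with a minimal, self-contained sketch of a battery-trading environment. This is not the authors' simulation model: the class name BatteryFCRMarketEnv, the price model, the frequency-deviation noise, and the penalty rule are purely hypothetical assumptions chosen for illustration.

# Illustrative only: a toy two-timescale environment in the spirit of the one
# described in the abstract. All names, prices, and dynamics are hypothetical
# assumptions, not the authors' model.
import random


class BatteryFCRMarketEnv:
    """Hourly reserve-capacity bids (market timescale) plus per-second
    state-of-charge bookkeeping (real-time timescale)."""

    def __init__(self, capacity_kwh=1000.0, power_kw=1000.0, hours=24):
        self.capacity_kwh = capacity_kwh  # usable energy capacity
        self.power_kw = power_kw          # maximum charge/discharge power
        self.hours = hours                # episode length in market periods
        self.reset()

    def reset(self):
        self.hour = 0
        self.soc = 0.5 * self.capacity_kwh  # start at 50% state of charge
        return self._observation()

    def _observation(self):
        # Hour of day, normalised state of charge, noisy price forecast (EUR/MW/h).
        return (self.hour, self.soc / self.capacity_kwh, 20.0 + 5.0 * random.random())

    def step(self, bid_mw):
        """bid_mw: reserve capacity offered for the next hour (the market-timescale action)."""
        cleared_price = 20.0 + 5.0 * random.random()  # hypothetical clearing price
        revenue = bid_mw * cleared_price

        # Real-time timescale: 3600 one-second frequency deviations drive the
        # charge/discharge of the committed capacity; running out of headroom
        # counts as a delivery failure and is penalised.
        penalty = 0.0
        for _ in range(3600):
            deviation_hz = random.gauss(0.0, 0.05)
            demand_kw = deviation_hz / 0.1 * bid_mw * 1000.0   # full activation at 0.1 Hz
            demand_kw = max(-self.power_kw, min(self.power_kw, demand_kw))
            self.soc -= demand_kw / 3600.0                     # kWh moved in one second
            if not 0.0 <= self.soc <= self.capacity_kwh:
                penalty += 1.0
                self.soc = min(max(self.soc, 0.0), self.capacity_kwh)

        self.hour += 1
        done = self.hour >= self.hours
        return self._observation(), revenue - penalty, done, {}


# Usage: a random bidding policy interacting with the toy environment.
env = BatteryFCRMarketEnv()
obs, done, total_reward = env.reset(), False, 0.0
while not done:
    obs, reward, done, _ = env.step(bid_mw=random.uniform(0.0, 1.0))
    total_reward += reward
print(f"episode reward: {total_reward:.1f}")

A reinforcement learning agent would replace the random bidding policy, observing the state returned by reset and step and choosing bid_mw so as to maximise the cumulative reward over the episode.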

Description

Funding: This research was supported by Business Finland grant 7439/31/2018. Publisher Copyright: © 2021 by the authors. Licensee MDPI, Basel, Switzerland.

Keywords

Artificial intelligence, Battery, Electricity market, Frequency containment reserve, Frequency reserve, Real‐time, Reinforcement learning, Simulation, Timescale

Citation

Aaltonen, H, Sierla, S, Subramanya, R & Vyatkin, V 2021, 'A simulation environment for training a reinforcement learning agent trading a battery storage', Energies, vol. 14, no. 17, 5587. https://doi.org/10.3390/en14175587