Designing a Fuzzy Q-Learning Power Energy System Using Reinforcement Learning

Avanija J., Suneetha Konduru, Vijetha Kura, Grande NagaJyothi, Bhanu Prakash Dudi, Mani Naidu S.
Copyright © 2022 | Pages: 12
DOI: 10.4018/IJFSA.306284

Abstract

Modern power and energy systems are becoming more complex and uncertain as distributed energy resources (DERs), flexible loads, and other emerging technologies are integrated, which poses great challenges for operation and control. Furthermore, the deployment of modern sensors and smart meters generates a considerable amount of data, opening the door to new data-driven approaches for tackling complex operation and control problems. Reinforcement learning (RL) is one of the most widely advocated strategies for such control and optimization problems. Designing a fuzzy Q-learning power energy system with an RL technique will control and reduce the problems arising in the energy system.
Article Preview

1. Introduction

Global warming is one of the most serious and pressing issues of our time. Human and industrial activity has relied on fossil fuels (coal, oil, and gas) as a source of energy for many years. This energy source, however, is not only unsustainable but is also driving global warming. The energy transition, i.e., decarbonizing the energy supply by switching to renewable sources and reducing consumption through higher efficiency, is a primary global goal (Perera et al., 2021; Duchesne et al., 2020). The Energy Roadmap 2020 aims to cut greenhouse gas emissions by 20%, raise the share of renewable energy to 20% of consumption, and achieve a 20% gain in energy efficiency relative to 1990 levels. Several recent studies have started to explicitly include the availability and unique characteristics of renewable energy resources in the operation of cellular network infrastructure. In off-grid scenarios (desert, island, etc.) where access to the power grid is unavailable and diesel generators are too expensive, renewable energy adoption is a necessity. The major goal is to replace energy sources with an environmentally friendly solution that reduces mobile networks' carbon emissions while also enabling large long-term savings on MNO operating energy expenses through efficient use of renewable energy. Figure 1 shows the different types of storage techniques available: electrochemical storage such as batteries, kinetic storage such as flywheels, mechanical storage such as pumped hydro, and other methods (Sharmin et al., 2021; Alfaverh et al., 2020). This paper focuses on battery storage used with photovoltaic installations at the distribution network level. A battery is a device that converts chemical energy into electrical energy.
During a battery's chemical reactions, electrons pass from one substance (electrode) to another through an external circuit, and this electron flow generates an electric current that can be used to do work. Batteries can be built on a variety of chemistries, which affect the battery's energy density, load characteristics, maintenance requirements, self-discharge, and operating costs (Heo et al., 2019; Sun, 2019; Nie, 2017).

The growing popularity of renewable energy sources (RES) provides an opportunity for researchers and utilities to build transactive energy management systems, often known as microgrids. A single household energy management system (HEMS) can exploit the resources available from RES by adding an energy storage system (ESS). The ESS capacity can then be tuned further, resulting in a more efficient and optimized HEMS. Several demand response (DR) techniques have been proposed in a variety of papers; many used YALMIP toolboxes to present solutions based on linear programming. RL has been proposed as an alternative, artificial intelligence (AI) based approach to the DR scheduling problem. Reinforcement learning had previously been used in energy market studies, but few considered using it for load scheduling.
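The fuzzy Q-learning idea named in the title can be illustrated with a minimal sketch: fuzzy membership functions soften the discretization of a continuous state (here, battery state of charge), and each fuzzy rule keeps its own Q-values, updated in proportion to how strongly it fires. The state variable, triangular memberships, and reward are illustrative assumptions for this sketch, not the controller actually proposed in the article.

```python
import random

# Illustrative sketch only: state = battery state of charge (SOC in [0, 1]),
# fuzzified by three triangular membership functions. These choices are
# assumptions for demonstration, not the paper's actual design.
def memberships(soc):
    low = max(0.0, 1.0 - soc / 0.5)             # peaks at soc = 0.0
    med = max(0.0, 1.0 - abs(soc - 0.5) / 0.5)  # peaks at soc = 0.5
    high = max(0.0, (soc - 0.5) / 0.5)          # peaks at soc = 1.0
    return [low, med, high]

ACTIONS = ["charge", "idle", "discharge"]
ALPHA, GAMMA, EPS = 0.1, 0.9, 0.1

# One Q-value per (fuzzy rule, action) pair.
Q = [[0.0] * len(ACTIONS) for _ in range(3)]

def fuzzy_q(soc):
    """Global Q-values: firing-strength-weighted sum over the rules."""
    mu = memberships(soc)
    return [sum(mu[r] * Q[r][a] for r in range(3)) for a in range(len(ACTIONS))]

def choose_action(soc):
    """Epsilon-greedy selection on the aggregated fuzzy Q-values."""
    if random.random() < EPS:
        return random.randrange(len(ACTIONS))
    q = fuzzy_q(soc)
    return q.index(max(q))

def update(soc, a, reward, next_soc):
    """Q-learning update, crediting each rule by its firing strength."""
    mu = memberships(soc)
    target = reward + GAMMA * max(fuzzy_q(next_soc))
    td = target - fuzzy_q(soc)[a]
    for r in range(3):
        Q[r][a] += ALPHA * mu[r] * td
```

Because the memberships overlap, a single experience at, say, SOC = 0.4 updates both the "low" and "medium" rules, which is what lets the controller generalize across nearby states without a fine-grained Q-table.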

Figure 1. Classification of energy storage methods
