MARKET-AWARE OPTIMIZATION OF GRID-CONNECTED SOLAR AND WIND FARMS USING REINFORCEMENT LEARNING

Authors

  • Hiren Chandrakant Pathak, Lecturer, Government Polytechnic, Godhra.

DOI:

https://doi.org/10.29121/ijoest.v7.i5.2023.724

Keywords:

Reinforcement Learning, Solar–Wind Hybrid System, Grid-Connected Farms, Market-Aware Dispatch, Energy Optimization, Smart Grids

Abstract

The growing integration of renewable energy into modern power systems has created a need for smart, flexible, and economically viable dispatch strategies. Although solar and wind farms are emission-free, their output is highly variable due to weather conditions and therefore requires real-time decision-making, especially for grid-connected plants exposed to fluctuating market prices. This paper presents a market-aware optimization strategy based on Reinforcement Learning (RL) to improve the operation, grid stability, and economic performance of combined solar–wind farms.
The framework integrates dynamic electricity market prices, time-of-use tariffs, demand-response signals, and renewable generation forecasts into a single RL environment. A deep actor–critic RL model is trained to make energy-dispatch decisions, such as how much power to export to the grid, how much power to charge into or discharge from battery storage, and when and how to apply strategic curtailment. The reward function is designed to minimize overall operating cost and power wastage while maximizing profit from market participation.
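
To make the dispatch formulation above concrete, the following is a minimal illustrative sketch in Python, not the paper's implementation, of such an RL environment using a Gymnasium-style interface. The class name, battery and grid limits, placeholder generation and price profiles, and the reward expression are all assumptions introduced for illustration.

import numpy as np
import gymnasium as gym
from gymnasium import spaces


class HybridFarmDispatchEnv(gym.Env):
    """Illustrative environment. Observation: [solar MW, wind MW, market price, tariff, battery SoC]."""

    def __init__(self, horizon=24, battery_capacity_mwh=50.0, grid_limit_mw=100.0):
        self.horizon = horizon
        self.battery_capacity = battery_capacity_mwh
        self.grid_limit = grid_limit_mw
        # Actions: export fraction of available power, battery command (-1 = full charge,
        # +1 = full discharge), and curtailment fraction.
        self.action_space = spaces.Box(low=np.array([0.0, -1.0, 0.0], dtype=np.float32),
                                       high=np.array([1.0, 1.0, 1.0], dtype=np.float32))
        self.observation_space = spaces.Box(low=0.0, high=np.inf, shape=(5,), dtype=np.float32)

    def reset(self, seed=None, options=None):
        super().reset(seed=seed)
        self.t, self.soc = 0, 0.5 * self.battery_capacity
        return self._obs(), {}

    def step(self, action):
        export_frac, batt_cmd, curtail_frac = [float(a) for a in action]
        solar, wind, price, tariff, _ = self._obs()
        available = (solar + wind) * (1.0 - curtail_frac)

        # Simple battery model: command scales an assumed 10 MW rating; SoC is clipped to its bounds.
        discharge = max(batt_cmd, 0.0) * 10.0
        charge = max(-batt_cmd, 0.0) * 10.0
        self.soc = float(np.clip(self.soc + charge - discharge, 0.0, self.battery_capacity))

        exported = float(np.clip(available * export_frac + discharge, 0.0, self.grid_limit))
        wasted = max(available - exported - charge, 0.0)

        # Illustrative reward: market revenue minus a tariff-weighted grid charge
        # and a penalty for curtailed or unexported energy.
        reward = exported * price - 0.05 * tariff * exported - 0.5 * wasted
        self.t += 1
        return self._obs(), reward, self.t >= self.horizon, False, {}

    def _obs(self):
        # Placeholder generation and price profiles; the paper drives the simulation
        # with real-world solar and wind generation data instead.
        solar = 40.0 * max(np.sin(np.pi * self.t / self.horizon), 0.0)
        wind = 30.0 + 10.0 * np.sin(0.5 * self.t)
        price = 50.0 + 20.0 * np.sin(0.3 * self.t)
        tariff = 5.0
        return np.array([solar, wind, price, tariff, self.soc], dtype=np.float32)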
Trained in a simulation model driven by real-world solar and wind generation data, the RL agent learns adaptive strategies that outperform conventional rule-based and deterministic optimization methods. Results show that curtailment is significantly reduced, storage utilization and grid interactions increase, and economic returns improve. The agent dynamically adjusts its decisions to real-time market fluctuations, giving the farms the flexibility needed for profitable operation.
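
As one possible way to train the deep actor–critic agent described above, the sketch below uses PPO from Stable-Baselines3 on the illustrative environment from the previous snippet; the paper's actual algorithm, hyperparameters, and evaluation pipeline are not specified here, and the values shown are assumptions.

from stable_baselines3 import PPO

# Train an actor-critic policy (PPO is used here as one common actor-critic method).
env = HybridFarmDispatchEnv()
model = PPO("MlpPolicy", env, learning_rate=3e-4, verbose=0)
model.learn(total_timesteps=100_000)

# Roll the trained policy out for one simulated day and report total reward (a profit proxy).
obs, _ = env.reset()
total_reward, done = 0.0, False
while not done:
    action, _ = model.predict(obs, deterministic=True)
    obs, reward, done, truncated, _ = env.step(action)
    total_reward += reward
print(f"Episode reward (profit proxy): {total_reward:.1f}")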
The paper provides evidence that market-aware RL can be deployed as a strategic control tool in the next generation of smart renewable energy farms, increasing energy resilience, sustainability, and economic performance in future power networks.

Published

2023-10-31

How to Cite

Pathak, H. C. (2023). MARKET-AWARE OPTIMIZATION OF GRID-CONNECTED SOLAR AND WIND FARMS USING REINFORCEMENT LEARNING. International Journal of Engineering Science Technologies, 7(5), 88–97. https://doi.org/10.29121/ijoest.v7.i5.2023.724