Optimal energy management strategies for electric vehicles: advanced control and learning-based perspectives

Date

2022-05-02

Authors

Zhang, Qian

Abstract

Motivated by the transition toward a zero-carbon economy for climate change mitigation, electrification is especially promising in the transportation sector. Electric Vehicles (EVs) are at the forefront of this energy transition and are being adopted at a rapid pace. Advanced control and optimization play an important role in enabling and enhancing the energy efficiency of EV systems and infrastructure; however, existing management and control methods for EVs still face difficulties and limitations. To further the widespread adoption of EVs, this dissertation comprises two main parts: 1) power management for Plug-in Hybrid Electric Vehicles (PHEVs), and 2) charging control for Plug-in Electric Vehicles (PEVs). Chapter 2 deals with the combined power management and route planning problem for PHEVs, designing a control algorithm that finds the route with minimum energy consumption. Chapter 3 addresses the heavy computation and communication workloads that large PEV populations impose on the electric power grid and studies a control algorithm that reduces both. Chapter 4 focuses on charging control for PEVs and explores how to improve charging efficiency while satisfying safety requirements. Chapter 5 extends the results of Chapter 4 by incorporating battery capacity degradation into the optimization problem. The dissertation begins in Chapter 1 with a review of state-of-the-art control methods for PEVs and PHEVs.

Chapter 2 studies a novel control scheme that combines route planning with power management for PHEVs. By accounting for the power management of the PHEV, we aim to find the route that leads to the minimum energy consumption. The scheme adopts a two-loop structure: in the outer loop, the minimum-energy route is obtained by minimizing the difference between the value function of the current round and the best value from all previous rounds; in the inner loop, the energy consumption index associated with PHEV power management along each feasible route is trained with Reinforcement Learning (RL). Under the RL framework, a nonlinear approximator structure, consisting of an actor approximator and a critic approximator, is built to approximate the control actions and the energy consumption. In addition, the convergence of the value function for PHEV power management in the inner loop and the asymptotic stability of the closed-loop system are rigorously guaranteed.
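
As a rough illustration of this two-loop idea, the following Python sketch uses a toy scalar state-of-charge model, a quadratic proxy for energy cost, and linear-in-features actor and critic approximators. All names, dynamics, and update rules here are hypothetical simplifications for exposition, not the dissertation's actual models or algorithms.

    import numpy as np

    def phi(x):
        # Quadratic feature map shared by the critic and the actor.
        return np.array([1.0, x, x * x])

    def energy_step(x, u):
        # Toy PHEV power-management stage: next state of charge and stage energy cost.
        x_next = 0.95 * x - 0.1 * u        # battery depletion plus engine-assist action u
        cost = x_next ** 2 + 0.5 * u ** 2  # crude proxy for energy consumption
        return x_next, cost

    def inner_loop_rl(x0, episodes=300, horizon=25, gamma=0.95, lr=5e-3):
        # Actor-critic training of the energy-consumption index for one candidate route.
        w_v = np.zeros(3)  # critic weights: V(x) ~ w_v . phi(x)
        w_a = np.zeros(3)  # actor weights:  u(x) ~ w_a . phi(x)
        for _ in range(episodes):
            x = x0
            for _ in range(horizon):
                u = float(w_a @ phi(x))
                x_next, c = energy_step(x, u)
                # Critic: temporal-difference step toward c + gamma * V(x_next).
                delta = c + gamma * float(w_v @ phi(x_next)) - float(w_v @ phi(x))
                w_v += lr * delta * phi(x)
                # Actor: descend the one-step lookahead cost, using the known toy model
                # and the chain rule through u = w_a . phi(x).
                dV = w_v[1] + 2.0 * w_v[2] * x_next
                dJ_du = -0.2 * x_next + u - 0.1 * gamma * dV
                w_a -= lr * dJ_du * phi(x)
                x = x_next
        return float(w_v @ phi(x0)), w_a  # learned energy index for this route, and policy

    def outer_loop(routes, x0=0.8):
        # Outer loop: keep the route whose learned energy index beats the best seen so far.
        best_route, best_value = None, np.inf
        for r in routes:
            value, _ = inner_loop_rl(x0)  # in the dissertation, dynamics and cost depend on r
            if value < best_value:
                best_route, best_value = r, value
        return best_route, best_value

    print(outer_loop(["route_A", "route_B"]))

In the dissertation the approximators are nonlinear and the analysis establishes convergence of the inner-loop value function and closed-loop stability; the linear sketch above only conveys the structure of the two loops.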

Chapter 3 investigates a self-triggered Model Predictive Control (MPC) scheme with Integral Sliding Mode (ISM) for a networked nonlinear continuous-time system subject to state and input constraints as well as additive disturbances and uncertainties. Compared with standard MPC, the proposed scheme is designed for PEV charging to reduce the heavy communication load caused by a large vehicle population under a centralized charging control architecture. In the proposed scheme, the constrained optimization problem is solved aperiodically to generate the control signal and the next execution time, leading to possible reductions in both computation and communication. The ISM component is used to reject matched uncertainties. A self-triggered condition is derived by comparing the cost function values obtained with different execution periods, and the robust MPC with ISM control strategy is rigorously analyzed under the self-triggered scheme.

Chapter 4 proposes a charging control algorithm for the valley-filling problem that also meets individual charging requirements. We study a decentralized framework for the PEV charging problem with a coordination task. An iterative learning-based model predictive charging control algorithm is developed to achieve valley-filling performance, and the decentralized MPC design satisfies individual charging requirements. The iterative learning method approximates the electricity price function and the sampled safe set of system states to improve the accuracy of the optimization. The decentralized problem, in which each individual PEV minimizes its own charging cost, is formulated based on the sum of all power loads.

Chapter 5 studies a modified version of the charging control algorithm of Chapter 4. We propose a charging control algorithm for PEVs using a decentralized MPC framework supplemented by the iterative learning method. By considering battery aging, we aim to find the optimal charging rate that achieves valley-filling performance. The scheme adopts the iterative learning-based method to solve the optimal control problem with a battery aging model: the sampled safe set and the price function are updated as the iteration number increases, and the battery aging model is incorporated into the cost function to better reflect realistic charging scenarios. In addition, the recursive feasibility of the proposed optimal control problem for PEV charging with battery aging and the asymptotic stability of the closed-loop system are rigorously studied.

Finally, Chapter 6 presents the conclusions of the dissertation and avenues for future research.
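
The decentralized coordination described for Chapters 4 and 5 can likewise be sketched numerically. The snippet below is a hypothetical simplification, assuming a price that grows linearly with the aggregate load, a quadratic stand-in for battery degradation, and a proximal per-vehicle update projected onto each vehicle's energy and power limits; these modeling choices, parameter values, and names are illustrative and do not come from the dissertation, whose actual algorithm uses MPC with a learned price function and sampled safe set.

    import numpy as np

    T = 24                                                 # charging horizon (hours)
    N = 50                                                 # number of PEVs
    base = 60 + 25 * np.sin(np.linspace(0, 2 * np.pi, T))  # non-EV base load (kW)
    E_req = np.full(N, 10.0)                               # energy each PEV must receive (kWh)
    u_max = 3.3                                            # per-vehicle charging limit (kW)
    alpha = 0.05                                           # weight of the battery-aging penalty
    rho = 2.0 * N                                          # proximal weight, large enough to damp oscillation

    def project(target, energy, cap):
        # Euclidean projection onto {0 <= u <= cap, sum(u) = energy}, by bisection on the multiplier.
        lo, hi = target.min() - cap - 1.0, target.max() + 1.0
        for _ in range(60):
            mu = 0.5 * (lo + hi)
            if np.clip(target - mu, 0.0, cap).sum() > energy:
                lo = mu
            else:
                hi = mu
        return np.clip(target - 0.5 * (lo + hi), 0.0, cap)

    u = np.zeros((N, T))                  # charging profiles, one row per PEV
    for _ in range(80):                   # coordination / learning rounds
        price = base + u.sum(axis=0)      # broadcast price grows with the aggregate load
        for i in range(N):
            # Each PEV minimizes its price-weighted charging cost plus a quadratic aging
            # penalty while staying close to its previous profile; since the objective is
            # a scaled identity quadratic, projecting its unconstrained minimizer onto the
            # energy and power limits gives the exact local solution.
            target = (rho * u[i] - price) / (rho + 2.0 * alpha)
            u[i] = project(target, E_req[i], u_max)

    total = base + u.sum(axis=0)
    print("peak-to-valley gap of the total load:", round(total.max() - total.min(), 2))

Charging shifts toward low-price (valley) hours while each vehicle still receives its required energy, which is the qualitative behavior that the chapters formalize and analyze.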

Keywords

power management, charging control, model predictive control
