IEEE Access (Jan 2024)

RIS-Assisted CR-MEC Systems Using Deep Reinforcement Learning Approach

  • Pham Duy Thanh,
  • Hoang Thi Huong Giang,
  • Ic-Pyo Hong

DOI: https://doi.org/10.1109/ACCESS.2024.3522783
Journal volume & issue: Vol. 12, pp. 198167–198183

Abstract


Nowadays, running applications on devices is restricted by limited computational capability and battery capacity, which calls for the aid of mobile edge computing (MEC). This article aims to enhance MEC systems by employing three techniques: 1) adjusting the wireless environment using a reconfigurable intelligent surface (RIS), 2) opportunistically exploiting licensed spectrum using cognitive radio (CR), and 3) scavenging renewable energy using energy harvesting. We consider a RIS-assisted CR-MEC network in which a secondary user (SU) powered by solar energy attempts to optimize its computation. We study the computation rate maximization of the SU, which opportunistically either performs local computation or offloads data to the MEC server on a primary channel. A Markov decision process problem is formulated and first solved by a proposed deep policy gradient scheme, in which the system learns the policy directly from the gradients of actions. To obtain higher stability, we subsequently propose a deep Q-learning scheme that derives a proper solution by maximizing the state-action value function. By taking advantage of both policy-based and value-based methods, we further develop a deep actor-critic scheme, where an actor selects actions and a critic evaluates them to acquire an optimal policy.
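To make the actor-critic idea in the abstract concrete, below is a minimal PyTorch sketch of a generic discrete-action actor-critic update, not the authors' exact model: the state dimension, action set (e.g., idle / local computing / offloading over the RIS link), and network sizes are placeholder assumptions introduced here for illustration.

```python
# Minimal actor-critic sketch under assumed state/action definitions.
# The actual state (battery, harvested energy, PU activity, channel/RIS
# conditions) and action space in the paper may differ.
import torch
import torch.nn as nn
import torch.nn.functional as F

STATE_DIM = 6    # assumed: e.g., battery level, solar arrival, PU state, channel gains
N_ACTIONS = 4    # assumed: e.g., idle / local compute / offload via RIS / offload direct

class Actor(nn.Module):
    """Policy network: maps a state to a distribution over discrete actions."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(STATE_DIM, 64), nn.ReLU(),
                                 nn.Linear(64, N_ACTIONS))
    def forward(self, s):
        return F.softmax(self.net(s), dim=-1)

class Critic(nn.Module):
    """Value network: estimates V(s), used to compute the advantage."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(STATE_DIM, 64), nn.ReLU(),
                                 nn.Linear(64, 1))
    def forward(self, s):
        return self.net(s).squeeze(-1)

actor, critic = Actor(), Critic()
opt = torch.optim.Adam(list(actor.parameters()) + list(critic.parameters()), lr=1e-3)
gamma = 0.99

def update(s, a, r, s_next, done):
    """One actor-critic step: the critic evaluates the action via the TD error,
    and the actor is pushed toward actions with a positive advantage."""
    s = torch.as_tensor(s, dtype=torch.float32)
    s_next = torch.as_tensor(s_next, dtype=torch.float32)

    v = critic(s)
    v_next = critic(s_next).detach()
    td_target = r + gamma * v_next * (0.0 if done else 1.0)
    advantage = (td_target - v).detach()

    log_prob = torch.log(actor(s)[a] + 1e-8)
    actor_loss = -log_prob * advantage        # policy-gradient step weighted by advantage
    critic_loss = F.mse_loss(v, td_target)    # value regression toward the TD target

    opt.zero_grad()
    (actor_loss + critic_loss).backward()
    opt.step()
```

The same skeleton degenerates into the two baselines the abstract mentions: dropping the critic and weighting the log-probability by the episodic return recovers a deep policy gradient scheme, while replacing the actor with an argmax over a learned Q-network corresponds to the deep Q-learning scheme.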

Keywords