Information-Theoretic Performance Limitations of Feedback Control: Underlying Entropic Laws and Generic $\mathcal{L}_{p}$ Bounds

In this paper, we utilize information theory to study the fundamental performance limitations of generic feedback systems, in which the controller and the plant may be arbitrary causal functions/mappings and the disturbance may have an arbitrary distribution. In particular, we obtain fundamental $\mathcal{L}_p$ bounds on the control error that are completely characterized by the conditional entropy of the disturbance, based upon entropic laws that are inherent in any feedback system. We also discuss the generality of the obtained bounds and their implications, e.g., for the fundamental limits of learning-based control.
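As an illustrative sketch only (not a statement of the paper's exact theorems), relations of this type can be traced to the maximum-entropy property of the generalized Gaussian distribution: for any real-valued random variable $z$ with finite differential entropy $h(z)$ (in nats) and any $p \ge 1$,
\[
\left(\mathbb{E}\,|z|^{p}\right)^{1/p} \;\ge\; \frac{e^{\,h(z)-\frac{1}{p}}}{2\,p^{1/p}\,\Gamma\!\left(1+\frac{1}{p}\right)},
\]
with equality when $z$ is generalized Gaussian. Taking $z$ to be the control error and lower bounding $h(z)$ by the conditional entropy of the disturbance then yields bounds of the kind described above; the symbol $z$ and this particular form are used here purely for illustration and are not taken from the abstract itself.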