On Bellman equations of ergodic control in $\mathbb{R}^n$.

We are interested in the study of equations of the form
$$
(1.1)\qquad Lu + H(x, \nabla u) + \lambda = q(x), \qquad x \in \mathbb{R}^n,
$$
where $L$ is a second order differential operator in divergence form,
$$
Lu = -D_i\bigl(a_{ij} D_j u\bigr),
$$
and $H$ is a nonlinear function of $\nabla u$, called the Hamiltonian. In (1.1), $\lambda$ is a constant. One should view $(u, \lambda)$ as a pair of unknowns and observe that $u$ is defined only up to an additive constant. Equation (1.1) is called the Bellman equation of ergodic control. It arises naturally in the context of control theory, as the limit of the following problem
$$
(1.2)\qquad Lu_\alpha + H(x, \nabla u_\alpha) + \alpha u_\alpha = q
$$
as $\alpha$ tends to $0$. Consider as a model problem the case
$$
(1.3)\qquad -\Delta u_\alpha + |\nabla u_\alpha|^2 + \alpha u_\alpha = q,
$$
which can be solved explicitly by the formula
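As an aside, the passage from (1.2) to (1.1) is commonly understood through the vanishing-discount heuristic. The sketch below is a standard formal argument, not taken from the source; the reference point $x_0$ and the assumption that the indicated limits exist are introduced here only for illustration.
$$
% Formal vanishing-discount limit (assumption: both limits below exist;
% x_0 is an arbitrary reference point chosen for normalization).
\lambda \;=\; \lim_{\alpha \to 0} \alpha\, u_\alpha(x_0),
\qquad
u(x) \;=\; \lim_{\alpha \to 0} \bigl( u_\alpha(x) - u_\alpha(x_0) \bigr).
$$
Writing $\alpha u_\alpha = \alpha u_\alpha(x_0) + \alpha\bigl(u_\alpha - u_\alpha(x_0)\bigr)$ in (1.2), noting that $L$ annihilates constants so $Lu_\alpha = L\bigl(u_\alpha - u_\alpha(x_0)\bigr)$, and letting $\alpha \to 0$ formally yields
$$
L u + H(x, \nabla u) + \lambda = q(x),
$$
which is (1.1); the normalization $u(x_0) = 0$ fixes the additive constant.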