SAVES: a sustainable multiagent application to conserve building energy considering occupants

This paper describes an innovative multiagent system called SAVES whose goal is to conserve energy in commercial buildings. We specifically focus on an application to be deployed in an existing university building, which offers several key novelties: (i) developed jointly with the university facility management team, SAVES is based on actual occupant preferences and schedules, actual energy consumption and loss data, real sensors and hand-held devices, etc.; (ii) it addresses novel scenarios that require negotiating with groups of building occupants to conserve energy; (iii) it targets a non-residential building, where occupants have no direct financial incentive to save energy and thus require a different mechanism to be motivated effectively; and (iv) SAVES uses a novel algorithm that generates optimal MDP policies while explicitly considering multi-criteria optimization (energy and personal comfort) as well as uncertainty over occupant preferences when negotiating energy reduction, a combination of challenges not addressed by previous MDP algorithms. In a validated simulation testbed, we show that SAVES substantially reduces overall energy consumption compared to the existing control method while achieving comparable average satisfaction levels for occupants. As a real-world test, we present results of a trial study in which SAVES is shown to lead occupants to conserve energy in real buildings.
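
To make the multi-criteria planning idea concrete, the following is a minimal sketch, not the SAVES algorithm itself: finite-horizon value iteration for a toy negotiation MDP in which the two objectives, energy saved and occupant comfort, are combined by a weighted scalarization, and uncertainty over occupant preferences is approximated by planning against the worst case over a small set of candidate acceptance models. All states, actions, acceptance probabilities, and weights below are illustrative assumptions.

```python
# Minimal illustrative sketch, not the SAVES algorithm: finite-horizon value
# iteration for a toy negotiation MDP with a scalarized reward (energy vs.
# comfort) and worst-case planning over uncertain occupant preferences.

STATES = ["negotiating", "accepted", "rejected"]      # toy negotiation states
ACTIONS = {"propose_1C": 1.0, "propose_2C": 2.0}      # proposed setback in deg C
HORIZON = 3                                           # number of negotiation rounds

# Candidate occupant models (hypothetical): probability each proposal is accepted.
# Taking the minimum over these models approximates uncertainty over preferences.
OCCUPANT_MODELS = [
    {"propose_1C": 0.9, "propose_2C": 0.5},
    {"propose_1C": 0.7, "propose_2C": 0.3},
]

W_ENERGY, W_COMFORT = 0.6, 0.4                        # scalarization weights (assumed)


def reward(action, accepted):
    """Weighted sum of energy saved and comfort lost for one proposal outcome."""
    setback = ACTIONS[action]
    energy_saved = setback if accepted else 0.0       # larger setback saves more energy
    comfort = -setback if accepted else 0.0           # ...but reduces occupant comfort
    return W_ENERGY * energy_saved + W_COMFORT * comfort


def value_iteration():
    """Backward induction over the horizon; returns values and a time-indexed policy."""
    V = {s: 0.0 for s in STATES}                      # terminal values
    policy = {}
    for t in reversed(range(HORIZON)):
        V_next, V = V, {}
        for s in STATES:
            if s != "negotiating":                    # accepted/rejected are absorbing
                V[s] = V_next[s]
                continue
            best_action, best_q = None, float("-inf")
            for a in ACTIONS:
                # Worst-case expected value over the candidate occupant models.
                q = min(
                    m[a] * (reward(a, True) + V_next["accepted"])
                    + (1.0 - m[a]) * (reward(a, False) + V_next["rejected"])
                    for m in OCCUPANT_MODELS
                )
                if q > best_q:
                    best_action, best_q = a, q
            V[s] = best_q
            policy[(t, s)] = best_action
    return V, policy


if __name__ == "__main__":
    values, policy = value_iteration()
    print("Worst-case value of negotiating:", round(values["negotiating"], 3))
    print("Policy by (round, state):", policy)
```

The worst-case treatment of the acceptance models is one simple stand-in for preference uncertainty; a fuller treatment would resemble bounded-parameter or multi-objective MDP formulations rather than this scalarized toy example.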
