Performance of optimized solar central receiver systems as a function of receiver thermal loss per unit area

Abstract

Recent efforts in solar central receiver research have been directed toward high-temperature applications. High-temperature processes entail greater receiver thermal losses due to thermal radiation and convection. This article examines the performance of central receiver systems having optimized heliostat fields and receiver aperture areas as a function of the receiver thermal loss per unit area. The results bear on application optimization, where the receiver design and temperature, and consequently the thermal loss per unit area, may vary. A reasonable range of values for the primary independent variable L (the average thermal loss per unit area of receiver aperture) and a reasonable set of design assumptions were first established. Heliostat field analysis and optimization were then carried out with a detailed computational model. Results are discussed for tower focal heights of 150 and 180 m. Values of L from 0.04 to 0.50 MW/m² were considered, corresponding roughly to working-fluid temperatures in the range of 650–1650°C. As L increases over this range, both the receiver thermal efficiency and the receiver interception factor decrease. For the base-case set of design assumptions, the optimal power level drops by almost half and the cost per unit of energy produced increases by about 25%. The accompanying decrease in solar subsystem efficiency (relative to the defined annual input energy), from 0.57 to 0.35, is about 40% and is a significant effect. Unoptimized systems would experience an even greater degradation in cost-effectiveness.
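
As a rough illustration of how L enters a receiver energy balance, the following Python sketch evaluates a thermal efficiency of the simple form η = 1 − L·A/P_in over the abstract's range of L. This is a minimal sketch, not the paper's model: the energy balance, the incident power, and the aperture area used below are illustrative assumptions; only the range of L is taken from the abstract.

```python
# Illustrative sketch (not the paper's model): receiver thermal efficiency
# under a simple energy balance, eta = 1 - L * A / P_in, where L is the
# average thermal loss per unit aperture area. The incident power and
# aperture area below are hypothetical placeholders, not study data.

def receiver_thermal_efficiency(incident_power_mw: float,
                                aperture_area_m2: float,
                                loss_per_area_mw_m2: float) -> float:
    """Fraction of incident power delivered to the working fluid,
    with total thermal loss modeled as L (MW/m^2) times aperture area."""
    loss_mw = loss_per_area_mw_m2 * aperture_area_m2
    return max(0.0, 1.0 - loss_mw / incident_power_mw)

if __name__ == "__main__":
    P_IN_MW = 50.0   # assumed incident power at the aperture (hypothetical)
    A_AP_M2 = 40.0   # assumed receiver aperture area (hypothetical)
    for L in (0.04, 0.10, 0.25, 0.50):   # MW/m^2, the range from the abstract
        eta = receiver_thermal_efficiency(P_IN_MW, A_AP_M2, L)
        print(f"L = {L:.2f} MW/m^2 -> eta_th = {eta:.3f}")
```

Even this simplified balance reproduces the qualitative trend reported in the abstract: as L grows toward the upper end of its range, the thermal efficiency falls markedly, which is why the optimization trades aperture area (interception) against thermal loss.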