Combining self-explicated priors with conjoint data using Bayesian regression

A Bayesian regression procedure (RBAYES) is proposed for the optimal combination of self-explicated data (priors) and conjoint judgments. The procedure does not require the design matrix for the conjoint judgments to be of full rank. The Bayesian regression procedure is similar to weighted least squares in that it uses an information ratio to weight the priors. We compare the proposed method empirically against (1) a Stein-type estimator (SBAYES) using one data set and (2) OLS applied to data from an adaptive conjoint analysis using a second data set. In the second application we also use an alternating least squares procedure, both by itself and in combination with Bayesian regression (RBAYES+), to accommodate scale incompatibility as well as heteroscedasticity. In both applications the Bayesian regression procedure yields superior results.
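As a rough illustration of the kind of combination the abstract describes, the sketch below shows a conjugate-normal Bayesian regression in which self-explicated part-worths serve as the prior mean and an information ratio (error variance of the conjoint ratings relative to prior variance) weights the prior against the conjoint data. The function name, the arguments `beta_prior`, `sigma2`, and `tau2`, and the identity prior covariance are illustrative assumptions, not the paper's RBAYES specification.

```python
import numpy as np

def bayes_combine(X, y, beta_prior, sigma2, tau2):
    """Sketch: combine self-explicated priors with conjoint judgments.

    X, y       : conjoint design matrix and ratings
    beta_prior : self-explicated part-worths (prior mean), assumed N(beta_prior, tau2 * I)
    sigma2     : assumed error variance of the conjoint ratings
    tau2       : assumed variance of the self-explicated priors

    The ratio sigma2 / tau2 plays the role of an information ratio
    weighting the priors, analogous to weighted least squares / ridge.
    """
    k = X.shape[1]
    w = sigma2 / tau2                    # information ratio (illustrative)
    A = X.T @ X + w * np.eye(k)          # invertible even if X'X is rank-deficient
    b = X.T @ y + w * beta_prior
    return np.linalg.solve(A, b)         # posterior mean of the part-worths
```

Because the prior contributes `w * I` to the normal equations, the combined estimate remains well defined even when the conjoint design matrix is not of full rank, which is one of the properties the abstract claims for RBAYES.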