IMPROVING PREFERENCE PREDICTION ACCURACY WITH FEATURE LEARNING

Motivated by continued interest within the design community in modeling design preferences, this paper investigates the question of predicting preferences, with particular application to consumer purchase behavior: how can we obtain high prediction accuracy in a consumer preference model using market purchase data? To this end, we employ sparse coding and sparse restricted Boltzmann machines, recent feature learning methods from machine learning, to transform the original market data into a sparse, high-dimensional representation. We show that these feature learning techniques, which are independent of the preference model itself (e.g., a logit model), can complement existing efforts toward high-accuracy preference prediction. Using actual passenger car market data, we achieve a significant improvement in prediction accuracy on a binary preference task by appropriately transforming the original consumer and passenger car variables into a sparse, high-dimensional representation.
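To make the pipeline concrete, the sketch below illustrates the general idea under stated assumptions: it uses sparse coding (one of the two feature learning methods mentioned above) to map a design matrix of consumer and vehicle attributes into sparse, higher-dimensional codes, and then fits a binary logit (logistic regression) on those codes. The arrays X and y are placeholders, and the dictionary size and sparsity settings are illustrative choices, not values from the paper.

```python
# Minimal sketch: sparse-coding features feeding a binary logit model.
# X (observations x attributes) and y (0/1 purchase choice) are assumed,
# synthetic inputs; n_components and alpha are illustrative settings.
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 20))          # placeholder consumer + vehicle variables
y = (X[:, 0] + X[:, 1] > 0).astype(int)  # placeholder binary preference labels

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

scaler = StandardScaler().fit(X_train)
X_train_s, X_test_s = scaler.transform(X_train), scaler.transform(X_test)

# Learn an overcomplete dictionary, then encode each observation as a
# sparse, higher-dimensional activation vector (the learned features).
coder = MiniBatchDictionaryLearning(
    n_components=100,                    # overcomplete: more atoms than input dims
    alpha=1.0,                           # sparsity penalty during dictionary learning
    transform_algorithm="lasso_lars",
    transform_alpha=1.0,                 # sparsity penalty when encoding
    random_state=0,
).fit(X_train_s)
Z_train, Z_test = coder.transform(X_train_s), coder.transform(X_test_s)

# Binary logit model fit on the learned sparse codes rather than raw variables.
logit = LogisticRegression(max_iter=1000).fit(Z_train, y_train)
print("test accuracy:", logit.score(Z_test, y_test))
```

In this setup the feature learning step is independent of the preference model: the same learned codes could instead be fed to any other discrete choice estimator, which is the sense in which such transformations complement existing preference modeling efforts.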
