Wood moisture content prediction using feature selection techniques and a kernel method

Wood is a renewable, abundant, and environmentally friendly bio-energy resource. The Moisture Content (MC) of woody biomass is a key parameter for controlling biofuel product quality and properties. In this paper, we are interested in predicting MC from data. The input impedance of a half-wave dipole antenna buried in a wood pile varies with the permittivity of the wood. Hence, the measured reflection coefficient, which carries information about the input impedance, depends directly on the wood MC. The relationship between the reflection coefficient measurements and the MC is studied. Based on this relationship, MC predictive models that combine machine learning techniques with feature selection methods are proposed. Numerical experiments on real-world data show the relevance of the proposed approach, which requires only limited computational power. A real-time implementation in industrial processes is therefore feasible.

Highlights
- The prediction of moisture content for two wood-chip species using the dielectric properties of wood is studied.
- Nonlinear models are built to predict the reflection coefficient values from frequencies.
- These reflection coefficients are used as input variables of a moisture content predictive model designed with the Least Squares Support Vector Machines (LS-SVM) technique and feature selection methods.
- Numerical experiments on real-world data show the effectiveness of the proposed methodology, which requires only limited computational power.

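The kind of pipeline described above (reflection-coefficient features feeding a kernel regression model after feature selection) can be sketched as follows. This is a minimal illustration, not the authors' implementation: scikit-learn's kernel ridge regression stands in for LS-SVM regression (the two solve closely related regularized least-squares problems), a simple univariate filter stands in for the feature selection methods studied in the paper, and the data, variable names (reflection_coeffs, moisture_content), and hyper-parameter grid are synthetic placeholders.

```python
# Illustrative sketch only: kernel ridge regression as an LS-SVM analogue,
# with a filter-type feature selection step, on synthetic placeholder data.
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.feature_selection import SelectKBest, f_regression
from sklearn.kernel_ridge import KernelRidge
from sklearn.model_selection import GridSearchCV, KFold

rng = np.random.default_rng(0)
# Placeholder data: rows = wood-chip samples, columns = reflection-coefficient
# features measured at different frequencies (synthetic, for illustration).
reflection_coeffs = rng.normal(size=(120, 30))
moisture_content = reflection_coeffs[:, :5].sum(axis=1) + 0.1 * rng.normal(size=120)

pipeline = Pipeline([
    ("scale", StandardScaler()),
    ("select", SelectKBest(score_func=f_regression)),  # stand-in feature selection
    ("model", KernelRidge(kernel="rbf")),               # kernel method (LS-SVM analogue)
])

param_grid = {
    "select__k": [5, 10, 20],
    "model__alpha": [1e-3, 1e-2, 1e-1, 1.0],
    "model__gamma": [1e-2, 1e-1, 1.0],
}
search = GridSearchCV(
    pipeline,
    param_grid,
    cv=KFold(n_splits=5, shuffle=True, random_state=0),
    scoring="neg_root_mean_squared_error",
)
search.fit(reflection_coeffs, moisture_content)
print("best CV RMSE:", -search.best_score_)
print("best hyper-parameters:", search.best_params_)
```

The cross-validated grid search mirrors the low computational budget emphasized in the abstract: both the feature subset size and the kernel hyper-parameters are tuned jointly, and the resulting model evaluates a single kernel expansion at prediction time.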