Simultaneous feature selection and parameter optimisation using an artificial ant colony : case study of melting point prediction

O'Boyle, Noel M and Palmer, David S and Nigsch, Florian and Mitchell, John BO (2008) Simultaneous feature selection and parameter optimisation using an artificial ant colony : case study of melting point prediction. Chemistry Central Journal, 2. p. 21. (https://doi.org/10.1186/1752-153X-2-21)


Abstract

We present a novel feature selection algorithm, Winnowing Artificial Ant Colony (WAAC), that performs simultaneous feature selection and model parameter optimisation for the development of predictive quantitative structure-property relationship (QSPR) models. The WAAC algorithm is an extension of the modified ant colony algorithm of Shen et al. (J Chem Inf Model 2005, 45: 1024-1029). We test the ability of the algorithm to develop a predictive partial least squares model for the Karthikeyan dataset (J Chem Inf Model 2005, 45: 581-590) of melting point values. We also test its ability to perform feature selection on a support vector machine model for the same dataset.

Starting from an initial set of 203 descriptors, the WAAC algorithm selected a PLS model with 68 descriptors that has an RMSE on an external test set of 46.6 degrees C and R2 of 0.51. The number of components chosen for the model was 49, which was close to optimal for this feature selection. The selected SVM model has 28 descriptors (cost of 5, epsilon of 0.21) and an RMSE of 45.1 degrees C and R2 of 0.54. This model outperforms a kNN model (RMSE of 48.3 degrees C, R2 of 0.47) for the same data and has similar performance to a Random Forest model (RMSE of 44.5 degrees C, R2 of 0.55). However, it is much less prone to bias at the extremes of the range of melting points, as shown by the slope of the line through the residuals: -0.43 for WAAC/SVM compared with -0.53 for Random Forest.

With a careful choice of objective function, the WAAC algorithm can be used to optimise machine learning and regression models that suffer from overfitting. Where model parameters also need to be tuned, as is the case with support vector machine and partial least squares models, it can optimise these simultaneously. The moving probabilities used by the algorithm are easily interpreted in terms of the best and current models of the ants, and the winnowing procedure promotes the removal of irrelevant descriptors.
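
To illustrate the kind of procedure the abstract describes, the sketch below shows a minimal ant-colony-style loop that selects descriptors and SVM parameters simultaneously, scoring each candidate by cross-validated RMSE. It is not the authors' WAAC implementation: the data, parameter grids, probability update and the crude winnowing step are all simplified placeholders assumed for illustration.

```python
# Illustrative sketch only: ant-colony-style simultaneous feature selection and
# SVM parameter search, loosely in the spirit of WAAC. Data and settings are placeholders.
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Placeholder data: rows = molecules, columns = descriptors, y = melting points (degrees C).
X = rng.normal(size=(200, 50))
y = rng.normal(loc=120.0, scale=40.0, size=200)

n_features = X.shape[1]
prob = np.full(n_features, 0.5)     # probability that an "ant" includes each descriptor
cost_choices = [1.0, 5.0, 10.0]     # candidate SVM cost values (assumed grid)
eps_choices = [0.1, 0.2, 0.5]       # candidate epsilon values (assumed grid)

best_rmse, best_subset, best_params = np.inf, None, None

for iteration in range(20):         # colony iterations (arbitrary)
    for ant in range(10):           # ants per iteration (arbitrary)
        mask = rng.random(n_features) < prob
        if mask.sum() == 0:         # guard against an empty descriptor subset
            continue
        C = rng.choice(cost_choices)
        eps = rng.choice(eps_choices)
        model = SVR(C=C, epsilon=eps)
        # Objective: cross-validated RMSE, which penalises overfitted subsets.
        rmse = -cross_val_score(model, X[:, mask], y,
                                scoring="neg_root_mean_squared_error", cv=3).mean()
        if rmse < best_rmse:
            best_rmse, best_subset, best_params = rmse, mask.copy(), (C, eps)
    if best_subset is not None:
        # Pull inclusion probabilities towards the best subset found so far
        # (a stand-in for the moving-probability update described in the paper),
        # then clip so rarely useful descriptors drift towards exclusion.
        prob = np.clip(0.9 * prob + 0.1 * best_subset, 0.05, 0.95)

print(f"Best CV RMSE: {best_rmse:.1f}, descriptors kept: {best_subset.sum()}, "
      f"cost={best_params[0]}, epsilon={best_params[1]}")
```

With a real descriptor matrix in place of the random placeholder data, the same loop structure could be driven by any cross-validated objective; the choice of objective is what guards against overfitting, as the abstract notes.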