Bayesian Optimization for Wrapper Feature Selection

|vortragender=Adrian Kruck
|email=uaenk@student.kit.edu
|vortragstyp=Masterarbeit
|betreuer=Jakob Bach
|termin=Institutsseminar/2019-12-20
|kurzfassung=Wrapper feature selection can lead to highly accurate classifications, but its computational costs are generally very high. Bayesian optimization, on the other hand, has already proven to be very efficient at optimizing black-box functions. Our approach uses Bayesian optimization to minimize the number of evaluations, i.e. the number of models trained on different feature subsets. We propose four different ways to set up the objective function for the Bayesian optimization and use Gaussian processes and random forests as surrogate models. Logistic regression and naive Bayes serve as the classifiers applied to the selected feature subsets. On 14 classification datasets, the approach is compared against 14 established feature selection methods, including other wrapper methods as well as filter and embedded methods, in terms of classification accuracy and runtime. Our approach keeps up with the most established feature selection methods, but the evaluation also shows that the experimental setup does not reward feature selection enough. We conclude with guidelines for a more appropriate experimental setup and with several concepts for developing Bayesian optimization for wrapper feature selection further.
}}
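
The core loop described in the abstract, i.e. letting a surrogate model propose which feature subset to evaluate next so that fewer classifiers have to be trained, can be sketched in a few lines. The following is a minimal illustration only, not the implementation from the thesis: it assumes a scikit-learn-style setup, uses a random forest surrogate with a simple UCB-style acquisition over binary feature masks, and the dataset, classifier, initial design size, and iteration count are illustrative assumptions.

<syntaxhighlight lang="python">
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X, y = load_breast_cancer(return_X_y=True)  # illustrative dataset
n_features = X.shape[1]

def objective(mask):
    """Cross-validated accuracy of the classifier on the selected feature subset."""
    if not mask.any():
        return 0.0
    clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
    return cross_val_score(clf, X[:, mask.astype(bool)], y, cv=5).mean()

# Initial design: random 0/1 feature masks, each evaluated once.
masks = rng.integers(0, 2, size=(10, n_features))
scores = np.array([objective(m) for m in masks])

for _ in range(30):  # each iteration costs one expensive model training
    # Surrogate: random forest regression over the mask space.
    surrogate = RandomForestRegressor(n_estimators=100, random_state=0)
    surrogate.fit(masks, scores)

    # Acquisition: sample candidate masks and pick the one with the best
    # mean-plus-uncertainty estimate (a simple UCB over the tree predictions).
    candidates = rng.integers(0, 2, size=(500, n_features))
    per_tree = np.stack([t.predict(candidates) for t in surrogate.estimators_])
    ucb = per_tree.mean(axis=0) + per_tree.std(axis=0)
    best = candidates[np.argmax(ucb)]

    masks = np.vstack([masks, best])
    scores = np.append(scores, objective(best))

best_mask = masks[np.argmax(scores)]
print("best accuracy:", scores.max(), "- features selected:", int(best_mask.sum()))
</syntaxhighlight>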
