High-Dimensional Neural-Based Outlier Detection: Difference between revisions

From IPD-Institutsseminar
(The page was newly created: „{{Vortrag |vortragender=Daniel Popovic |email=daniel.popovic@live.de |vortragstyp=Diplomarbeit |betreuer=Edouard Fouché |termin=Institutsseminar/2017-10-06 |k…“)
 
 
Line 5:
 
|betreuer=Edouard Fouché
 
 
|termin=Institutsseminar/2017-10-06
 
|kurzfassung=Outlier detection in high-dimensional spaces is a challenging task because of the consequences of the curse of dimensionality. Neural networks have recently gained popularity for a wide range of applications, owing to the availability of computational power and large training data sets. Several studies examine the application of different neural network models, such as autoencoders, self-organising maps, and restricted Boltzmann machines, to outlier detection, mainly in low-dimensional data sets. In this diploma thesis we investigate whether these neural network models scale to high-dimensional spaces, adapt suitable neural-network-based algorithms to the task of high-dimensional outlier detection, examine data-driven parameter selection strategies for these algorithms, develop suitable outlier score metrics for these models, and investigate the possibility of identifying the outlying subspace for detected outliers.
+
|kurzfassung=Outlier detection in high-dimensional spaces is a challenging task because of the consequences of the curse of dimensionality. Neural networks have recently gained popularity for a wide range of applications, owing to the availability of computational power and large training data sets. Several studies examine the application of different neural network models, such as autoencoders, self-organising maps, and restricted Boltzmann machines, to outlier detection, mainly in low-dimensional data sets. In this diploma thesis we investigate whether these neural network models scale to high-dimensional spaces, adapt suitable neural-network-based algorithms to the task of high-dimensional outlier detection, examine data-driven parameter selection strategies for these algorithms, develop suitable outlier score metrics for these models, and investigate the possibility of identifying the outlying dimensions for detected outliers.
 
}}
 

Current version as of 7 September 2017, 15:13

Speaker Daniel Popovic
Talk type Diploma thesis (Diplomarbeit)
Advisor Edouard Fouché
Date Fri, 6 October 2017
Presentation mode
Abstract Outlier detection in high-dimensional spaces is a challenging task because of the consequences of the curse of dimensionality. Neural networks have recently gained popularity for a wide range of applications, owing to the availability of computational power and large training data sets. Several studies examine the application of different neural network models, such as autoencoders, self-organising maps, and restricted Boltzmann machines, to outlier detection, mainly in low-dimensional data sets. In this diploma thesis we investigate whether these neural network models scale to high-dimensional spaces, adapt suitable neural-network-based algorithms to the task of high-dimensional outlier detection, examine data-driven parameter selection strategies for these algorithms, develop suitable outlier score metrics for these models, and investigate the possibility of identifying the outlying dimensions of detected outliers.
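As an illustrative aside, not taken from the thesis itself: the core idea behind autoencoder-based outlier detection is to score each point by its reconstruction error, since points the model reconstructs poorly do not fit the learned structure of the data. Because the optimum of a linear autoencoder coincides with projection onto the top principal components, the sketch below uses that closed form instead of gradient training; the synthetic data set and all sizes are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Inliers lie near a 3-d subspace of a 50-d space; one off-subspace outlier.
d, k, n = 50, 3, 200
basis = rng.normal(size=(k, d))
X = rng.normal(size=(n, k)) @ basis + 0.05 * rng.normal(size=(n, d))
outlier = 3.0 * rng.normal(size=(1, d))
X_all = np.vstack([X, outlier])

# The optimal linear autoencoder projects onto the top-k principal
# components, so we use the closed-form projection as a stand-in for
# a trained encoder/decoder pair.
mu = X.mean(axis=0)
_, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
P = Vt[:k].T @ Vt[:k]                 # encode-then-decode as one projection

recon = (X_all - mu) @ P + mu
scores = np.mean((X_all - recon) ** 2, axis=1)   # reconstruction error
print(int(np.argmax(scores)))  # prints 200: the injected outlier scores highest
```

A nonlinear autoencoder replaces the projection with learned encoder and decoder networks, but the scoring principle stays the same; how to turn such reconstruction errors into well-calibrated outlier scores in high dimensions is one of the questions the thesis addresses.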