Institutsseminar/2021-06-25



Session (All sessions)
Date: Friday, 25 June 2021
Time: 14:00 – 16:00 (duration: 120 min)
Location: web conference, https://sdqweb.ipd.kit.edu/wiki/Institutsseminar/Microsoft_Teams
Previous session: Fri, 25 June 2021
Next session: Fri, 2 July 2021
This session is currently scheduled for 120 minutes. If necessary, please reserve an additional room and split the session across two rooms: create an additional session page under Termine and reassign the talks.


Talks

Speaker: Julian Roßkothen
Title: Analyse von KI-Ansätzen für das Trainieren virtueller Roboter mit Gedächtnis
Talk type: Bachelor's thesis
Advisor: Daniel Zimmermann
Talk mode:
Abstract: This thesis compares several recurrent neural networks.

LSTMs, GRUs, CTRNNs, and Elman networks are examined. The networks are tested on their ability to memorize a point and then reach for that point with a virtual robot arm.

For LSTM, GRU, and Elman networks, it is also examined how the networks solve the task when each neuron can only access its own memory.

The results show that LSTMs and GRUs score considerably better in the experiments than CTRNNs and Elman networks. In addition, the computation time and the relationship between the number of trainable parameters and the experimental results are compared.
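
To illustrate the kind of comparison the abstract describes, the following Python sketch (an illustration, not the thesis setup) builds the cell types that have an off-the-shelf PyTorch module (LSTM, GRU, and an Elman network via nn.RNN; CTRNNs are omitted here), runs them on a toy "remember the point" sequence in which the target is visible only at the first time step, and reports the number of trainable parameters that the abstract relates to the experimental results. All dimensions and the readout layer are assumptions made for this example.

import torch
import torch.nn as nn

SEQ_LEN, BATCH, POINT_DIM, HIDDEN = 20, 32, 2, 64

# Recurrent cells with ready-made PyTorch modules; nn.RNN with tanh is an Elman network.
cells = {
    "LSTM": nn.LSTM(POINT_DIM, HIDDEN, batch_first=True),
    "GRU": nn.GRU(POINT_DIM, HIDDEN, batch_first=True),
    "Elman": nn.RNN(POINT_DIM, HIDDEN, nonlinearity="tanh", batch_first=True),
}

# Toy memory task: the target point is shown only at t = 0, all later inputs are zero.
points = torch.rand(BATCH, POINT_DIM)
inputs = torch.zeros(BATCH, SEQ_LEN, POINT_DIM)
inputs[:, 0, :] = points

for name, cell in cells.items():
    readout = nn.Linear(HIDDEN, POINT_DIM)   # maps the last hidden state to a predicted point
    outputs, _ = cell(inputs)                # shape: (BATCH, SEQ_LEN, HIDDEN)
    prediction = readout(outputs[:, -1, :])  # prediction after the memory delay
    loss = nn.functional.mse_loss(prediction, points)
    n_params = sum(p.numel() for p in cell.parameters())
    n_params += sum(p.numel() for p in readout.parameters())
    print(f"{name}: {n_params} trainable parameters, untrained loss {loss.item():.3f}")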

Speaker: Lukas Bach
Title: Automatically detecting Performance Regressions
Talk type: Master's thesis
Advisor: Robert Heinrich
Talk mode:
Abstract: One of the most important aspects of software engineering is system performance. Common approaches to verifying acceptable performance include running load tests on deployed software. However, complicated workflows and requirements, such as the need for deployments and extensive manual analysis of load test results, cause tests to be performed very late in the development process, so feedback on potential performance regressions only becomes available long after they were introduced.

With this thesis, we propose PeReDeS, an approach that integrates into the development cycle of modern software projects and explicitly models an automated performance regression detection system that provides feedback quickly and reduces the manual effort for setup and load test analysis. PeReDeS is embedded into continuous integration pipelines, manages the load test execution and lifecycle, processes load test results, and makes feedback available to the authoring developer via reports on the coding platform. We further propose a method for detecting performance deviations in load test results, based on Welch's t-test. The method is adapted to the context of performance regression detection and integrated into the PeReDeS detection pipeline. We implemented our approach and evaluated it in a user study and a data-driven study to assess the usability and accuracy of our method.
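
As a hedged illustration of the statistical core named in the abstract, the following Python sketch applies Welch's t-test (scipy.stats.ttest_ind with equal_var=False) to two sets of response times and flags a regression only when the candidate run is significantly slower. The function name, significance level, and sample values are assumptions made for this example; how PeReDeS actually adapts and integrates the test is described in the thesis itself.

from scipy import stats

def is_regression(baseline_ms, candidate_ms, alpha=0.01):
    """Flag the candidate run as a regression if it is significantly slower than the baseline."""
    # Welch's t-test: a two-sample t-test that does not assume equal variances.
    t_stat, p_value = stats.ttest_ind(candidate_ms, baseline_ms, equal_var=False)
    # One-sided decision: a regression requires the candidate mean to be higher (slower)
    # and the difference to be statistically significant.
    return t_stat > 0 and (p_value / 2) < alpha

# Response times in milliseconds from two load test runs (purely illustrative numbers).
baseline = [102, 98, 105, 99, 101, 97, 103, 100, 104, 98]
candidate = [118, 121, 115, 119, 123, 117, 120, 116, 122, 118]

print(is_regression(baseline, candidate))  # True for these illustrative samples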

Speaker: Jan Wittler
Title: Derivation of Change Sequences from State-Based File Differences for Delta-Based Model Consistency
Talk type: Master's thesis
Advisor: Timur Sağlam
Talk mode:
Abstract: In view-based software development, views may share concepts and thus contain redundant or dependent information. Keeping the individual views synchronized is crucial to avoid inconsistencies in the system. In approaches based on a Single Underlying Model (SUM), inconsistencies are avoided by establishing the SUM as a single source of truth from which views are projected. To synchronize updates from views to the SUM, delta-based consistency preservation is commonly applied. This requires the views to provide fine-grained change sequences, which are used to incrementally update the SUM. However, the functionality of providing these change sequences is rarely found in real-world applications. Instead, only state-based differences are persisted. Therefore, it is desirable to also support views that provide state-based differences in delta-based consistency preservation. This can be achieved by estimating the fine-grained change sequences from the state-based differences.

This thesis evaluates the quality of estimated change sequences in the context of model consistency preservation. To derive such sequences, matching elements across the compared models need to be identified and their differences need to be computed. We evaluate a sequence derivation strategy that matches elements based on their unique identifier and one that establishes a similarity metric between elements based on the elements’ features. As an evaluation baseline, different test suites are created. Each test consists of an initial and a changed version of both a UML class diagram and consistent Java source code. Using the different strategies, we derive and propagate change sequences based on the state-based difference of the UML view and evaluate the outcome in both domains. The results show that the identity-based matching strategy is able to derive the correct change sequence in almost all (97 %) of the considered cases. For the similarity-based matching strategy, we identify two recurring error patterns across different test suites. To address these patterns, we provide an extended similarity-based matching strategy that reduces the occurrence frequency of the error patterns while introducing almost no performance overhead.
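
To make the identity-based matching idea concrete, the following Python sketch (a simplified illustration, not the thesis implementation, which operates on UML and Java models) matches elements of two model states by their unique identifier and derives a coarse change sequence of create, delete, and modify operations from the state-based difference. The Element class and the feature dictionaries are assumptions made for this example.

from dataclasses import dataclass

@dataclass
class Element:
    id: str         # unique identifier used for matching
    features: dict  # e.g. name, type, references

def derive_change_sequence(old_state, new_state):
    """Derive change operations from two state snapshots, each a dict keyed by element id."""
    changes = []
    old_ids, new_ids = set(old_state), set(new_state)
    for element_id in sorted(new_ids - old_ids):   # elements only in the new state
        changes.append(("create", new_state[element_id]))
    for element_id in sorted(old_ids - new_ids):   # elements only in the old state
        changes.append(("delete", old_state[element_id]))
    for element_id in sorted(old_ids & new_ids):   # matched elements: compare their features
        if old_state[element_id].features != new_state[element_id].features:
            changes.append(("modify", old_state[element_id], new_state[element_id]))
    return changes

old = {"c1": Element("c1", {"name": "Customer"})}
new = {"c1": Element("c1", {"name": "Client"}), "c2": Element("c2", {"name": "Order"})}
print(derive_change_sequence(old, new))  # one "create" for c2 and one "modify" for c1

A similarity-based strategy would replace the identifier lookup with a feature-based similarity metric between elements, which is where the recurring error patterns discussed in the abstract can arise.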


Notes

Agreed with Ralf and Anne: the talks take place in parallel. (The Bachelor's thesis of Julian Roßkothen is in one session with Anne, and the other two theses are in a parallel session with Ralf.)