Doctoral Thesis
To secure lasting economic prosperity, the "Bildungsrepublik Deutschland" (education republic of Germany) was proclaimed several years ago. Yet increasing public spending on education is no solution without a suitable concept. One central factor must not be forgotten: the quality of an education system stands and falls with the quality of its teachers. Cathrin Sikor analyses the existing structure of supply and demand in the teacher labour market and describes its effects on teacher quality. While the framework conditions for career choice and advancement have changed considerably in other fields, the teaching profession essentially retains the form it acquired during industrialisation. Is it time for a new teaching profession?
The digital age makes it possible to be globally networked at any time, and digital communication is therefore a key aspect of today's world. Its further development and expansion are becoming increasingly important. Even within a wireless system, copper channels are important parts of the overall network. Given the need to keep pushing current limits, careful cable design combined with suitably adapted bit coding is essential to transmit ever more data.
One of the most popular and widespread cabling technologies is symmetrical copper cabling [1, pp. 8-15]. Also known as Twisted Pair, it is of immense importance for the cabling of communication networks.
At the time of writing this thesis, data rates of up to 10 Gbit/s over a transmission distance of 100 m and 40 Gbit/s over a transmission distance of 30 m are standardized for symmetrical copper cabling [2]. Other lengths are not standardized. Short lengths in particular are of great interest, because copper cables are usually used over short distances, such as between computers and the campus network or within data centres.
This work focuses on the transmission of higher-order Pulse Amplitude Modulation (PAM) and the associated transmission performance. The central research question is: “How well can we optimize the transmission technique in order to maximise the data bandwidth over Ethernet cable and, given that remote powering is also a significant application of these cables, how much will the resulting heating affect this transmission and what can be done to mitigate it?”
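Higher-order PAM packs several bits into each transmitted symbol; PAM-4, for example, maps two bits onto four amplitude levels, doubling the bit rate at a given symbol rate. A minimal sketch of a Gray-coded PAM-4 mapper (illustrative only; the exact coding used in the thesis and in the Ethernet standards is more elaborate):

```python
# Gray-coded PAM-4: two bits per symbol, four amplitude levels.
# Illustrative mapping; real standards define their own level coding.
GRAY_PAM4 = {(0, 0): -3, (0, 1): -1, (1, 1): +1, (1, 0): +3}

def bits_to_pam4(bits):
    """Map a bit sequence (even length) to PAM-4 amplitude levels."""
    if len(bits) % 2:
        raise ValueError("PAM-4 needs an even number of bits")
    return [GRAY_PAM4[(bits[i], bits[i + 1])] for i in range(0, len(bits), 2)]

symbols = bits_to_pam4([0, 0, 0, 1, 1, 1, 1, 0])
print(symbols)  # [-3, -1, 1, 3]
```

Gray coding ensures that adjacent amplitude levels differ in only one bit, so a decision error to a neighbouring level causes a single bit error.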
To answer this question, the cable parameters are first examined. A series of spectral measurements, such as Insertion Loss, Return Loss, Near End Crosstalk and Far End Crosstalk, provide information about the electromagnetic interference and the influence of the ohmic resistance on the signal. Based on these findings, the first theoretical statements and calculations can be made. In the next step, data transmissions over different transmission lengths are realized. The examination of the eye diagrams of the different transmission approaches ultimately provides information about the signal quality of the transmissions. An overview of the maximum transmission rate depending on the transmission distance shows the potential for different applications.
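The spectral quantities mentioned above (Insertion Loss, Return Loss, NEXT, FEXT) are all expressed in decibels derived from measured S-parameters. As a generic illustration (not the thesis's specific measurement setup), insertion loss can be computed from the linear magnitude of the transmission coefficient S21:

```python
import math

def insertion_loss_db(s21_magnitude):
    """Insertion loss in dB from the linear magnitude of S21 (transmission)."""
    return -20 * math.log10(s21_magnitude)

# If only half the signal voltage arrives, the insertion loss is about 6 dB.
print(round(insertion_loss_db(0.5), 2))  # 6.02
```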
Furthermore, the simultaneous transmission of energy and data is a significant advantage of copper. However, the resulting heat affects data transmission. The influence of the cables' ambient temperature is therefore investigated in the last part, and the resulting changes in signal quality are clarified.
Ever since the 1980s, researchers in computer science and robotics have been working on making cars autonomous. Due to recent breakthroughs in research and development, such as the Bertha Benz Project [ZBS+14], the goal of fully autonomous vehicles seems closer than ever before. Yet many questions remain unanswered. Especially now that the automotive industry is moving towards autonomous systems in series-production vehicles, the task of precise localization has to be solved with automotive-grade sensors while keeping memory and processing consumption at a minimum.

This thesis investigates the Simultaneous Localization and Mapping (SLAM) problem for autonomous driving scenarios on a parking lot using low-cost automotive sensors. The main focus is hereby devoted to the RAdio Detection And Ranging (RADAR) sensor, which has not been widely analyzed in an autonomous driving scenario so far, even though it is abundant in the automotive industry for applications such as Adaptive Cruise Control (ACC). Due to its high noise floor, the RADAR sensor has widely been disregarded in the Intelligent Transportation Systems and Robotics communities with regard to SLAM applications. In this thesis, however, it is shown that the RADAR sensor proves to be an affordable, robust and precise sensor when its physical properties are modeled correctly. In this regard, a GraphSLAM-based framework is introduced, which extracts features from the RADAR sensor and generates an optimized map of the surroundings using the RADAR sensor alone. This framework is used to enable crowd-based localization, which is not limited to the RADAR sensor alone. By integrating an automotive Light Detection and Ranging (LiDAR) sensor and a stereo camera, a robust and precise localization system can be built that is suitable for autonomous driving even in complex parking lot scenarios. It is thereby shown that the RADAR sensor contributes strongly to obtaining good results in a sensor fusion setup.
These results were obtained on an extensive parking lot dataset recorded over the course of several months. It contains different weather conditions, different configurations of parked cars and a multitude of different trajectories, used to validate the approaches described in this thesis and to conclude that the RADAR sensor is a reliable sensor for series-production autonomous driving systems, both in a multi-sensor framework and as a single component for localization.
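GraphSLAM, as mentioned above, optimizes a graph of relative pose constraints by least squares. As an illustrative toy (not the thesis framework itself), a one-dimensional pose graph with two odometry edges and one loop closure can be solved in closed form:

```python
# Toy 1D pose graph: poses x0 (fixed at 0), x1, x2.
# Odometry edges: x1 - x0 = 1.0, x2 - x1 = 1.0; loop closure: x2 - x0 = 2.1.
# Minimising the sum of squared residuals gives the normal equations:
#   d/dx1: (x1 - 1) - (x2 - x1 - 1) = 0   ->   2*x1 -   x2 = 0
#   d/dx2: (x2 - x1 - 1) + (x2 - 2.1) = 0 ->   -x1 + 2*x2 = 3.1
a11, a12, b1 = 2.0, -1.0, 0.0
a21, a22, b2 = -1.0, 2.0, 3.1
det = a11 * a22 - a12 * a21           # solve the 2x2 system by Cramer's rule
x1 = (b1 * a22 - a12 * b2) / det
x2 = (a11 * b2 - b1 * a21) / det
print(round(x1, 4), round(x2, 4))     # 1.0333 2.0667
```

The loop closure pulls the estimates slightly away from pure odometry (1.0, 2.0); real GraphSLAM solves the same kind of least-squares problem over thousands of 2-D/3-D poses and landmark constraints, typically with sparse solvers.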
In today’s marketplace, the consumption of luxury goods is at a peak due to increasing global wealth and low interest rates, resulting in a vast supply of goods and services for which customer experiences are more relevant than ever before. One of the most recent developments in this field shows that consumers no longer simply purchase a product or service based on its fact sheet; they are also interested in the experience around the product. Successful brands must develop and maintain individual images to sustain their competitive advantage and build brand equity that benefits customers and firms. Ideally, these will lead to satisfaction and loyalty between a brand, its products, and its customers. Existing research on brand experience and brand equity has mainly focused on functional aspects, which seem to differ for high-value luxury goods. Most studies have focused on industries like retail and fashion brands, sampling university students or visitors to shopping malls, and some have even mixed different types of industries together. This underpins the need for research within a single luxury industry with actual luxury customers who have a solid background in brand experiences.
The purpose of this study was to explore the brand experience spectrum within the automotive industry in Germany, particularly in the affordable luxury sports car sector. A clear aim of the study was to identify the factors and components that constitute, influence, or drive a brand experience from the customers' perspective. To achieve this, the study collected data from in-depth interviews with German respondents (n = 60) who had experience with affordable and luxury sports cars. The conceptual framework was based on two empirically tested models guiding this exploratory consumer research. The first model to build on was the consumer-based brand equity model, empirically tested by Çifci et al. (2016) and Nam et al. (2011). The second conceptual framework was Lemon and Verhoef's (2016) customer journey model, consisting of relevant touchpoints along three stages: pre-purchase, purchase, and post-purchase.
The findings of the research demonstrate that, although the six brand equity concepts – brand awareness, physical quality, staff behaviour, self-congruence, brand identification, and lifestyle – are broadly applicable in understanding customer experience in the affordable luxury car industry, the content of these dimensions differs from that suggested by the previous authors. The research established that cognitive and affective (or symbolic) components build the foundation of customer brand experience, supporting Çifci et al.'s (2016) and Nam et al.'s (2011) study results. The study also identified brand trust as an important and highly relevant concept for customer brand experience in the luxury automotive industry. Brand trust influences customer satisfaction and loyalty, thereby improving and complementing the existing model. Furthermore, the study confirmed Lemon and Verhoef's (2016) process model of the customer journey and experience; however, it suggested two different customer journeys depending on the customers' previous experience (first-time and experienced buyers). The relevance of the journey touchpoints within the three purchase stages differs significantly between the two groups. Key touchpoints identified for both groups are contact with a dealer and online information gathering. Differences were found in the length of the purchase stages and across the customer journey. The study highlights the importance of trust, identification, and product quality for customer brand experience. Moreover, the findings of this study complement the brand equity model of Çifci et al. (2016) by adding the new, highly relevant concept of trust. The current knowledge is complemented by a new understanding and mapping of the customer journey for luxury sports cars in Germany.
This study can assist practitioners and managers by providing a compass indicating which touchpoints are relevant to which customer group. Social value can be achieved by encouraging interactions between brand and consumer (e.g. central product launch events) and through brand-oriented interactions among consumers (e.g. dealer events, clubs, or communities). Customers are motivated to express their distinctiveness through product experience and brand identification (belonging/distinction) and to develop a loyal link to brands.
Globalisation has driven the emergence of new forms of societal governance. New forms of global rules are arising, along with new constellations of actors for setting and enforcing these rules. Political, economic, and civil-society actors, such as international organisations, transnational corporations, and non-governmental organisations, are gaining influence.
In this context, multi-stakeholder dialogues are increasingly being initiated, in which relevant actors from politics, business, and society organise themselves to develop solutions to global problems, for example by drawing up guidelines and standards. These forms of societal regulation are characterised by the need to implement new organisational structures and procedural rules, to learn new roles, and to integrate new actors.
This book analyses the governance structures of multi-stakeholder dialogues for leading, steering, and controlling such cooperation projects. A recent example of a transnational and transcultural cooperation project of this kind is the process initiated by the International Organization for Standardization (ISO) to develop a standard on the social responsibility of organisations ('Social Responsibility'). The ISO 26000 standard, published in November 2010, addresses all types of organisations in the public, non-profit, and private sectors, worldwide and regardless of size. This multi-stakeholder dialogue is reconstructed theoretically and analysed empirically. The theoretical perspective is shaped by a culturally informed governance economics and ethics operating on the basis of a generalised notion of stakeholders. The empirical analysis concentrates on the micro-governance used to steer deliberative multi-stakeholder dialogues.
The interdisciplinary study, its line of argument, and its results are of great interest both for the strategic management of companies and for the design of political processes. It contributes to the current societal debate on corporate responsibility and sustainability in a globalised society.
Knowledge is an important resource, whose transfer is still not completely understood. The underlying belief of this thesis is that knowledge cannot be transferred directly from one person to another but must be converted for the transfer and therefore is subject to loss of knowledge and misunderstanding. This thesis proposes a new model for knowledge transfer and empirically evaluates this model. The model is based on the belief that knowledge must be encoded by the sender to transfer it to the receiver, who has to decode the message to obtain knowledge.
To prepare the ground for the model, this thesis provides an overview of models for knowledge transfer and of the factors that influence knowledge transfer. The proposed theoretical model for knowledge transfer is implemented in a prototype to demonstrate its applicability. The model describes the influence of four layers, namely the code, syntactic, semantic, and pragmatic layers, on the encoding and decoding of the message. The precise description of the influencing factors and of the overlapping knowledge of sender and receiver facilitates its implementation.
The application area chosen for the layered model of knowledge transfer is business process modelling. Business processes constitute an important knowledge resource of an organisation, as they describe the procedures for producing products and services. The implementation in a software prototype allows a precise description of the process by adding semantics to the simple business process modelling language used.
This thesis contributes to the body of knowledge by providing a new model for knowledge transfer, which shows the process of knowledge transfer in greater detail and highlights influencing factors. The implementation in the area of business process modelling reveals the support provided by the model. An expert evaluation indicates that the implementation of the proposed model supports knowledge transfer in business process modelling. The results of this qualitative evaluation are supported by the findings of a quantitative evaluation, performed as a quasi-experiment with a pre-test/post-test design, two experimental groups, and one control group. Mann-Whitney U tests indicated that the group that used the tool implementing the layered model performed significantly better in terms of completeness (the degree of completeness achieved in the transfer) than the group that used a standard BPM tool (Z = 3.057, p = 0.002, r = 0.59) and the control group that used pen and paper (Z = 3.859, p < 0.001, r = 0.72). The experiment indicates that the implementation of the layered model supports the creation of a business process and facilitates a more precise representation.
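The reported effect sizes follow the common convention for the Mann-Whitney test, r = Z / √N, where N is the total sample size. A pure-Python sketch on hypothetical scores (not the thesis data; ties receive no rank correction here):

```python
import math

def mann_whitney_r(x, y):
    """Mann-Whitney U with normal approximation; effect size r = |Z| / sqrt(N).
    Simplified: ties are not averaged and no tie correction is applied."""
    n1, n2 = len(x), len(y)
    combined = sorted((v, g) for g, vals in ((0, x), (1, y)) for v in vals)
    r1 = sum(rank + 1 for rank, (v, g) in enumerate(combined) if g == 0)
    u1 = r1 - n1 * (n1 + 1) / 2            # U statistic of group x
    mu = n1 * n2 / 2                        # mean of U under H0
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    z = (u1 - mu) / sigma
    return u1, z, abs(z) / math.sqrt(n1 + n2)

# Hypothetical completeness scores: tool group vs. control group
u, z, r = mann_whitney_r([8, 9, 9, 10, 7, 8], [5, 6, 4, 6, 5, 7])
print(u, round(z, 3), round(r, 2))  # 35.0 2.722 0.79
```

For real analyses, a library implementation with tie handling (e.g. `scipy.stats.mannwhitneyu`) is preferable.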
Data collected from internet applications are mainly stored in the form of transactions. All transactions of one user form a sequence, which reflects the user's behaviour on the site. Nowadays, it is important to be able to classify this behaviour in real time for various reasons: e.g. to increase the conversion rate of customers while they are in the store, or to prevent fraudulent transactions before they are placed. However, this is difficult due to the complex structure of the data sequences (i.e. a mix of categorical and continuous data types, constant data updates) and the large amounts of data that are stored. Therefore, this thesis studies the classification of complex data sequences. It surveys the fields of time series analysis (temporal data mining), sequence data mining, and standard classification algorithms. It turns out that these algorithms are either difficult to apply to data sequences or do not deliver a classification: time series methods need a predefined model and cannot handle complex data types, while sequence classification algorithms such as the apriori algorithm family cannot utilize the time aspect of the data. The strengths and weaknesses of the candidate algorithms are identified and used to build a new approach to the classification of complex data sequences. The problem is solved by a two-step process. First, feature construction is used to create and discover suitable features in a training phase. Then, the blueprints of the discovered features are used in a formula during the classification phase to perform the real-time classification. The features are constructed by combining and aggregating the original data over the span of the sequence, including the elapsed time by means of a calculated time axis. Additionally, a combination of features and feature selection are used to simplify complex data types. This allows behavioural patterns that occur in the course of time to be captured.
This proposed approach combines techniques from several research fields. Part of the algorithm originates from the field of feature construction and is used to reveal behaviour over time and express it in the form of features. A combination of the features is used to highlight relations between them. The blueprints of these features can then be used to achieve classification in real time on an incoming data stream. An automated framework is presented that allows the features to adapt iteratively to a change in the underlying patterns of the data stream. This core feature of the presented work is achieved by separating the feature application step from the computationally costly feature construction step and by iteratively restarting the feature construction step on the new incoming data. The algorithm and the corresponding models are described in detail and applied to three case studies (customer churn prediction, bot detection in computer games, credit card fraud detection). The case studies show that the proposed algorithm is able to find distinctive information in data sequences and use it effectively for classification tasks. The promising results indicate that the suggested approach can be applied to a wide range of other application areas that involve data sequences.
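The feature-construction idea, aggregating a user's transaction sequence over its elapsed time into a fixed-length feature vector, can be sketched as follows (the field names and aggregations here are hypothetical illustrations; the thesis's own feature blueprints are more elaborate):

```python
def sequence_features(transactions):
    """Aggregate one user's transaction sequence into fixed-length features.
    Each transaction is a (timestamp_seconds, amount, category) tuple."""
    ts = [t[0] for t in transactions]
    amounts = [t[1] for t in transactions]
    elapsed = max(ts) - min(ts) if len(ts) > 1 else 0
    return {
        "n_events": len(transactions),
        "total_amount": sum(amounts),
        "mean_amount": sum(amounts) / len(amounts),
        "elapsed_s": elapsed,
        # rate feature: combines the count with the calculated time axis
        "events_per_hour": len(transactions) / (elapsed / 3600) if elapsed else 0.0,
        # categorical data simplified to a single numeric feature
        "n_categories": len({t[2] for t in transactions}),
    }

feats = sequence_features([(0, 20.0, "food"), (1800, 5.0, "food"), (3600, 75.0, "tech")])
print(feats["events_per_hour"], feats["n_categories"])  # 3.0 2
```

A standard classifier can then be trained on such vectors, and the same "blueprint" re-applied cheaply to each incoming sequence for real-time classification.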
This work focuses on supporting stent graft selection in the endovascular treatment of an infrarenal aortic aneurysm. Within the scope of the work, a method for evaluating the results of a finite element analysis of stent graft behaviour was designed, implemented, and discussed in a Germany-wide user study with 16 surgeons. The developed human-machine interface enables the vascular physician to interactively analyse computed fixation forces and contact states of several stent grafts in the context of the aortic section to be treated. The method allows physicians to engage more deeply with numerical simulations and stent graft assessment measures. In the user study, this made it possible to determine the application potential of numerical simulations for supporting stent graft selection and to define a requirements specification for a system for simulation-based stent graft planning. The main application potentials identified were the definition of a minimum degree of oversizing, the optimisation of the limb length of bifurcated stent grafts, and the comparison of different stent graft designs. Essential functions of a system for simulation-based stent graft selection include an overview map of colour-coded migration risk per stent graft and landing zone, the visualisation of the sealing state of the stent components, and the display of stent graft and vessel deformations in a 3D model.
This work is set within the broad context of Smart Cities and focuses on the area of intelligent vehicle driving, in both urban and interurban zones, through the collection of real-time data measured with sensors by the drivers themselves, as well as data captured through simulation.
The objective of this work is twofold. On the one hand, to study and apply different techniques and methods for detecting outliers in multivariate databases, and to compare them through tests carried out with real traffic data. On the other hand, to establish a relationship between anomalous traffic situations, such as congestion or accidents, and the multivariate outliers found.
Outlier detection is one of the most important tasks in any data analysis, whatever the domain or field of study, since one of its primary functions is to uncover useful and valuable information that is usually hidden by the high dimensionality of the data.
By using outlier detection mechanisms together with supervised classification methods, it becomes possible to recognise elements of the urban road infrastructure such as roundabouts, zebra crossings, junctions, or traffic lights.
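Multivariate outlier detection of this kind is often based on the Mahalanobis distance, which accounts for the scale of and correlation between variables. A small two-variable sketch on hypothetical speed/acceleration readings (illustrative only, not the methods compared in the thesis):

```python
def mahalanobis_2d(points, query):
    """Squared Mahalanobis distance of `query` from a sample of 2-D points."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    # sample covariance matrix entries
    sxx = sum((p[0] - mx) ** 2 for p in points) / (n - 1)
    syy = sum((p[1] - my) ** 2 for p in points) / (n - 1)
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points) / (n - 1)
    det = sxx * syy - sxy * sxy
    dx, dy = query[0] - mx, query[1] - my
    # d^T * inverse(cov) * d, written out for the 2x2 case
    return (syy * dx * dx - 2 * sxy * dx * dy + sxx * dy * dy) / det

# hypothetical (speed km/h, acceleration m/s^2) readings
data = [(50, 0.5), (52, 0.4), (48, 0.6), (51, 0.5), (49, 0.45)]
print(mahalanobis_2d(data, (50, 0.5)) < mahalanobis_2d(data, (120, 3.0)))  # True
```

Points whose squared distance exceeds a chosen chi-square quantile (2 degrees of freedom here) would be flagged as multivariate outliers.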