Erfolg durch Kooperation
(2009)
Relationship Marketing (RM) presumes trust as an important antecedent for the performance of interfirm relationships. Current research is dominated by an interpersonal perspective. In this research stream, trust chiefly emerges as a result of interpersonal relationships. But multiple risks arise if customer trust rests solely on elements inextricably linked to individual representatives. Hence, this paper evaluates the impact of organizational capabilities and the moderating role of customer preferences on the trust creation process. The framework presented here is tested cross-industry on a sample of 220 customers of IT solutions. The results offer significant insight into the effectiveness of individual and organizational RM strategies.
Trust is an essential resource for cooperation between suppliers and customers. In postmodern society, both sides depend on cooperation; without trust, however, joint relationships rarely lead to the desired results. Alexander Rossmann shows how customer trust can be stimulated and which behaviors should be avoided. Personal and organizational trust strategies are developed conceptually and examined empirically using the example of the IT industry. An analysis of the effects of trust offers differentiated insights into the opportunities and risks of trust from the supplier and customer perspectives.
In this presentation the audience will be: (a) introduced to the aims and objectives of the DBTechNet initiative; (b) briefed on the DBTech EXT virtual laboratory workshops (VLW), i.e. the education and training (E&T) content which is freely available over the internet and includes vendor-neutral hands-on laboratory training sessions on key database technology topics; and (c) informed about some of the practical problems encountered and the way they have been addressed. Last but not least, the audience will be invited to consider incorporating some or all of the DBTech EXT VLW content into their higher education (HE), vocational education and training (VET), and/or lifelong learning curricula. This comes at no cost and with no commitment on the part of the teacher/trainer; the latter is only expected to provide feedback on the pedagogical value and the quality of the E&T content received and used.
Suppliers need to improve their relational capabilities if they are to enhance customer trust. Debate about such capabilities is dominated by an interpersonal approach. This paper provides novel marketing options by expanding insights into alternative types of relational capabilities. Furthermore, the moderating role of customer preferences on the effectiveness of relational capabilities is evaluated.
Turning complainers into fans : towards a framework for customer services in social media channels
(2012)
In recent years, marketing scholars have invested heavily in exploring the role of social media in marketing theory and practice. One valuable strategy for using social media in marketing communication is to provide customer services in applications like Facebook or Twitter. This paper evaluates a) the concept of perceived service quality in different service channels and b) the impact customer service strategies have on customer loyalty, word of mouth communication, and cross-sell preferences. The framework presented here is tested cross-channel against data collected from the customer service department of a large telecommunication provider. The results elucidate the effectiveness of customer service strategies in different channels.
The impact of stress on human beings has become a serious problem. Reported effects include a higher rate of health disorders such as heart problems, obesity, asthma, diabetes, and depression, among others. An individual in a stressful situation has to cope with altered cognition as well as impaired decision making and problem solving. This can lead to a higher risk of accidents in dynamic environments such as driving. Several papers have addressed the estimation and prediction of a driver's stress level while driving. An equally important question concerns not only the stress level of the individual driver, but also the mutual influence between that driver and a group of other drivers in the vicinity. This paper proposes a system that determines groups of drivers in a nearby area as clusters and derives their individual stress levels. This information is analyzed to generate a stress map, a graphical view of road sections with a higher stress influence. The aggregated data can be used to generate navigation routes with a lower stress influence, as well as to recommend driving behavior that reduces stress-influenced driving and improves road safety.
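The abstract does not detail the clustering and aggregation step; a minimal sketch, assuming drivers are grouped into coarse location grid cells ("near area" clusters) whose mean stress forms the stress map, could look like the following. The function and parameter names are hypothetical, not taken from the paper:

```python
from collections import defaultdict
from statistics import mean

def build_stress_map(drivers, cell_size=0.01):
    """Aggregate individual stress levels into a per-cell stress map.

    drivers: iterable of (lat, lon, stress_level) tuples.
    cell_size: grid resolution in degrees; nearby drivers fall
    into the same cell and thus form a "near area" cluster.
    """
    cells = defaultdict(list)
    for lat, lon, stress in drivers:
        # Quantize coordinates to a grid cell key.
        key = (round(lat / cell_size), round(lon / cell_size))
        cells[key].append(stress)
    # The stress map: mean stress level per road-section cell.
    return {key: mean(levels) for key, levels in cells.items()}
```

A route planner could then prefer paths whose cells carry low mean stress values.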
This thesis studies concurrency control and composition of transactions in computing environments with long-living transactions where local data autonomy of transactions is indispensable. This kind of computing architecture is referred to as a Disconnected System, in which reads are segregated (disconnected) from writes, enabling local data autonomy. Disconnecting reads from writes is inspired by Bertrand Meyer's "Command Query Separation" pattern. This thesis provides a simple yet precise definition of a Disconnected System with a focus on transaction management. Concerning concurrency control, transaction management frameworks implement a "one concurrency control mechanism fits all needs" strategy. This strategy, however, does not consider specific characteristics of data access. The thesis shows the limitations of this strategy when transaction load increases, transactions are long-lived, local data autonomy is required, and serializability is the targeted isolation level. For example, in optimistic mechanisms the number of aborts increases sharply under load, while in pessimistic mechanisms locking causes long blocking times and is prone to deadlocks. These findings are not new, and a common solution used by database vendors is to reduce the isolation level. This thesis proposes a novel approach: choosing the concurrency control mechanism according to the semantics of data access of a certain data item. As a result, a transaction may execute under several concurrency control mechanisms. The idea is to introduce lanes similar to a motorway, where each lane is dedicated to a certain class of vehicle with the same characteristics. Whereas disconnecting reads and writes sets the traffic's direction, the semantics of data access defines the lanes. This thesis introduces four concurrency control classes capturing the semantics of data access, each with an associated tailored concurrency control mechanism.
Class O (the optimistic class) implements a first-committer-wins strategy, class R (the reconciliation class) implements a first-n-committers-win strategy, class P (the pessimistic class) implements a first-reader-wins strategy, and class E (the escrow class) implements a first-n-readers-win strategy. In contrast to solutions that adapt the concurrency control mechanism at runtime, the idea is to classify data during the design phase of the application and adapt the classification only in certain cases at runtime. The result of the thesis is a transaction management framework called O|R|P|E. A performance study based on the TPC-C benchmark shows that O|R|P|E performs better and has a considerably higher commit rate than other solutions. Moreover, the thesis shows that in O|R|P|E aborts are due to application-specific limitations, i.e., constraint violations, and not due to serialization conflicts. This is a result of considering the semantics of data access.
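Two of these strategies can be illustrated with a rough sketch. This is not the thesis's actual O|R|P|E implementation, whose interfaces the abstract does not give; it merely shows class O as a version-checked first-committer-wins rule and class E as a bounded escrow reservation, with hypothetical class and method names:

```python
class OptimisticItem:
    """Class O sketch: first-committer-wins via version validation."""
    def __init__(self, value):
        self.value, self.version = value, 0

    def read(self):
        # A transaction reads the value together with its version.
        return self.value, self.version

    def commit(self, new_value, read_version):
        # If another transaction committed first, the version moved on
        # and this commit must abort.
        if read_version != self.version:
            return False
        self.value, self.version = new_value, self.version + 1
        return True

class EscrowItem:
    """Class E sketch: first-n-readers-win via escrowed quantities."""
    def __init__(self, quantity):
        self.quantity = quantity

    def reserve(self, amount):
        # Reservations succeed as long as escrowed quantity remains;
        # later transactions fail once the quantity is exhausted.
        if amount > self.quantity:
            return False
        self.quantity -= amount
        return True
```

Under this scheme, each data item would be bound to one such class at design time, so a single transaction touching several items may run under several mechanisms at once.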
In this work, a web-based software architecture and framework for the management and diagnosis of large amounts of medical data in an ophthalmologic reading center is proposed. Data management for multi-center studies requires merging standing data with repeatedly gathered clinical evidence such as vital signs and raw data. When ophthalmologic questions are involved, data acquisition is often performed by non-medical staff at the point of care or at a study center, whereas the medical finding is mostly provided by an ophthalmologist in a specialized reading center. The study data, such as participants, cohorts, and measured values, are administered at a single data center for the entire study. Since a specialized reading center maintains several studies, the medical staff must learn a different data administration procedure for each data center. With respect to the increasing number and size of clinical studies, two aspects must be considered. First, an efficient software framework is required to support data management, processing, and diagnosis by medical experts at the reading center. Second, this software needs a standardized user interface that does not have to be trained, tailored, or adapted for each new study. Furthermore, different aspects of quality and security controls have to be included. Therefore, the objective of this work is to establish a multi-purpose ophthalmologic reading center, which can be connected to different data centers via configurable data interfaces in order to treat various topics simultaneously.
What do lastingly successful companies do differently? Armin Roth identifies the success factors and develops from them the concept of holistic performance management: it integrates the five sub-disciplines of corporate performance management, business process management, project and employee management, and the management of holism and long-term orientation into one system. Companies thereby sustainably increase their performance and competitiveness. Management cockpits provide goal-oriented steering at the strategic and operational levels.
This integrative approach is presented for the first time in this book.
The article shows which fundamental management methods and instruments can be identified to explain the difference between lastingly successful and unsuccessful companies. In this context, an approach for an end-to-end performance management process is developed, in which the central sources of problems in the introduction of performance management are classified and explained.
The article presents a central conceptual framework for the design of holistic and long-term performance management. It explains five equally weighted sub-disciplines which, in their form and combination, reveal the thematic and design complexity of performance management. The aim is to make the complex topic of performance management and its interrelations comprehensible and communicable through an easily understandable system, without providing a one-size-fits-all recipe.
This paper addresses the following four research questions: 1. How should customer service quality in social media channels be conceptualized on multiple levels? 2. Which aspects of customer service quality are important in enhancing customer satisfaction? 3. What outcomes are affected by customer service quality and customer satisfaction? 4. How effective are customer services delivered through social media channels compared to customer services delivered through other channels?
The Third International Conference on Data Analytics (DATA ANALYTICS 2014), held on August 24 - 28, 2014 - Rome, Italy, continued the inaugural event on fundamentals in supporting data analytics, special mechanisms and features of applying principles of data analytics, application oriented analytics, and target-area analytics.
Processing terabytes to petabytes of data, or incorporating non-structural and multi-structured data sources and types, requires advanced analytics and data science mechanisms for both raw and partially processed information. Despite considerable advancements in high performance, large storage, and high computation power, there are challenges in identifying, clustering, classifying, and interpreting a large spectrum of information.
A behavior marker for measuring non-technical skills of software professionals : an empirical study
(2015)
Managers recognize that software development teams need to be developed. Although technical skills are necessary, non-technical (NT) skills are equally, if not more, necessary for project success. Currently, there are no proven tools to measure the NT skills of software developers or software development teams. Behavioral markers (observable behaviors that have positive or negative impacts on individual or team performance) are successfully used by the airline and medical industries to measure NT skill performance. This research developed and validated a behavior marker tool by rating video clips of software development teams. The initial results show that the behavior marker tool can be used reliably with minimal training.
The digital enterprise requires new concepts of digital enterprise computing, which comprises an interdisciplinary combination of approaches from computer science, economics, and other relevant scientific disciplines. New architectures with integrated mobility systems, collaborative business processes, big data, and cloud ecosystems drive current and future business strategies and are what makes the digital transformation into new business fields possible in the first place. This requires close cooperation between various partners from science, industry, and society. The annual Digital Enterprise Computing conference positions the Gesellschaft für Informatik as a scientific co-organizer and builds on experience from the Enterprise Architecture Management working group of the Architectures section within the Software Engineering division of the Gesellschaft für Informatik.
This work focuses on supporting stent graft selection in the endovascular treatment of infrarenal aortic aneurysms. A method for evaluating the results of a finite element analysis of stent graft behavior was designed, implemented, and discussed in a Germany-wide user study with 16 surgeons. The developed human-machine interface allows the vascular physician to interactively analyze computed fixation forces and contact states of several stent grafts in the context of the aortic section to be treated. The method enables physicians to engage more deeply with numerical simulations and stent graft assessment measures. Within the user study, this made it possible to determine the application potential of numerical simulations for supporting stent graft selection and to define a requirements specification for a system for simulation-based stent graft planning. As a result, the key application potentials identified were the determination of a minimum degree of oversizing, the optimization of the limb length of bifurcated stent grafts, and the comparison of different stent graft designs. The essential functions of a system for simulation-based stent graft selection include an overview map with color-coded migration risk per stent graft and landing zone, the visualization of the sealing state of the stent components, and the display of stent graft and vessel deformations in a 3D model.