Informatik
Refine
Document Type
- Conference proceeding (570)
- Journal article (199)
- Book chapter (62)
- Doctoral Thesis (18)
- Book (10)
- Anthology (10)
- Patent / Standard / Guidelines (2)
- Report (2)
- Working Paper (2)
Is part of the Bibliography
- yes (875)
Institute
- Informatik (875)
- Technik (2)
- ESB Business School (1)
Publisher
- Springer (205)
- Hochschule Reutlingen (104)
- IEEE (90)
- Gesellschaft für Informatik e.V. (62)
- Elsevier (47)
- Association for Computing Machinery (38)
- IARIA (26)
- RWTH Aachen (15)
- De Gruyter (14)
- Association for Information Systems (12)
Social Media
(2011)
In recent times, the use of the Internet for inbound marketing has become increasingly important. The focus is on so-called social media platforms such as Facebook, YouTube, MySpace, XING, LinkedIn, Twitter, SlideShare, and Posterous. The development of these media stems from a change in how the Internet is used, often summarized under the catchword Web 2.0. Customers' changed media usage behavior creates both opportunities and risks for marketing.
The Internet is becoming increasingly important for marketing. The focus is on so-called social media applications such as Facebook, Twitter, or XING. For companies, the question arises whether customers' changed media usage behavior induces a new marketing logic. A recent study provides insights into the opportunities and risks, application conditions, and contextual factors for the use of social media in marketing.
Relationship Marketing (RM) presumes trust as an important antecedent for the performance of interfirm relationships. Current research is dominated by an interpersonal perspective, in which trust chiefly emerges as a result of interpersonal relationships. But multiple risks arise if customer trust rests solely on elements inextricably linked to single representatives. Hence, this paper evaluates the impact of organizational capabilities, and the moderating role of customer preferences, on the trust creation process. The framework presented here is tested cross-industry on 220 customers for IT solutions. The results offer significant insight into the effectiveness of individual and organizational RM strategies.
Where do I meet my customers? What do I learn from my users' feedback? How do I measure success? In social networks you have to ask the right questions, says Internet researcher Prof. Alexander Rossmann. His study "Auf der Suche nach dem Return on Social Media" (In Search of the Return on Social Media) at the University of St. Gallen once caused a sensation.
Erfolg durch Kooperation
(2009)
This paper addresses the following four research questions: 1. How should customer service quality in social media channels be conceptualized on multiple levels? 2. Which aspects of customer service quality are important in enhancing customer satisfaction? 3. What outcomes are effected by customer service quality and customer satisfaction? 4. How effective are customer services delivered through social media channels (as compared to customer services delivered through other channels)?
Trust is an essential resource for cooperation between suppliers and customers. In post-modern society, both sides depend on cooperation. Without trust, however, joint relationships rarely lead to the desired results. Alexander Rossmann shows how customer trust can be stimulated and which behaviors should be avoided. Personal and organizational trust strategies are developed conceptually and examined empirically using the example of the IT industry. An analysis of the effects of trust offers differentiated insights into the opportunities and risks of trust from both the supplier's and the customer's perspective.
Suppliers need to improve their relational capabilities if they are to enhance customer trust. Debate about such capabilities is dominated by an interpersonal approach. This paper provides novel marketing options by expanding insights into alternative types of relational capabilities. Furthermore, the moderating role of customer preferences on the effectiveness of relational capabilities is evaluated.
The studies available for Germany on digital transformation in small and medium-sized enterprises (SMEs) largely agree: SMEs struggle with digitalization. This article discusses why SMEs fail at digital transformation and what can be done about it.
The rise of digital technologies has become an important driver for change in multiple industries. Therefore, firms need to develop digital capabilities to manage the transformation process successfully. Prior research assumes that the development of a specific set of digital capabilities leads to higher digital maturity. However, a measurement framework for digital maturity does not exist in scholarly work. This paper therefore develops a conceptualization and measurement model for digital maturity.
Digital twins: a meta-review on their conceptualization, application, and reference architecture
(2022)
The concept of digital twins (DTs) is receiving increasing attention in research and management practice. However, various facets around the concept are blurry, including conceptualization, application areas, and reference architectures for DTs. A review of preliminary results regarding the emerging research output on DTs is required to promote further research and implementation in organizations. To do so, this paper asks four research questions: (1) How is the concept of DTs defined? (2) Which application areas are relevant for the implementation of DTs? (3) How is a reference architecture for DTs conceptualized? and (4) Which directions are relevant for further research on DTs? With regard to research methods, we conduct a meta-review of 14 systematic literature reviews on DTs. The results yield important insights for the current state of conceptualization, application areas, reference architecture, and future research directions on DTs.
Purpose – This paper aims to complement the current understanding about user engagement in electronic word-of-mouth (eWoM) communications across online services and product communities. It examines the effect of the senders’ prior experience with products and services, and their extent of acquaintance with other community members, on user engagement with the eWoM.
Design/methodology/approach – The study used a sample of 576 unique user postings from the corporate fan page of two German firms: a service community of a telecom provider and a product community of a car manufacturer. Multiple regression analysis is used to test the conceptual model.
Findings – Senders’ prior experience and acquaintance positively affect user engagement with eWoM, and these effects differ across communities for products and services and across their influence on “likes” and “comments”. The results also suggest that communities for products are orientated toward information sharing, while those discussing services engage in information building.
Research limitations/implications – This research explains mechanisms of user engagement with eWoM and opens directions for future research around motives, content, and social media tools within the structures of online communities. The insights on information-handling dimensions of online tools and antecedents to their use contribute to research on two topics prioritized by the Marketing Science Institute: "Measuring and Communicating the Value of Online Marketing Activities and Investments" and "Leveraging Digital/Social/Mobile Technology".
Practical implications – This research offers insights for firms to leverage user engagement and facilitate eWoM generation through members who have a higher number of acquaintances or who have more experience with the product or service. Executives should concentrate their community engagement strategies on the identification and utilization of power users. The conceptualization and empirical test about the role of likes and comments will help social media managers to create and better capture value from their social media metrics.
Originality/value – The insights about the underlying factors that influence engagement with eWoM advance our understanding about the usage of online content.
We were able to identify a set of specific capabilities corporations need to develop in order to enhance brand love. Furthermore, the effects of most dynamic capabilities on brand love correlate strongly with the degree of customer orientation. Other results are relevant concerning the proposed moderation and mediation hypotheses. Firstly, the impact of customer orientation on brand love varies under specific market conditions, supporting our central moderation hypothesis (β = .259, p = .001). To be precise, the impact of customer orientation is strongest in markets that have low competitive differentiation in products and services. Control variables such as age, gender, and market form (B2B versus B2C) lead to no significant heterogeneity in the data set. Finally, mediation analyses show no significant "direct effect" of the existing DC constructs on brand love, supporting the mediating role of customer orientation.
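The moderation logic reported above (the impact of customer orientation on brand love depending on competitive differentiation) can be illustrated with an ordinary least squares regression containing an interaction term. The sketch below uses synthetic data; the variable names, effect sizes, and sample size are illustrative assumptions, not the study's actual data or method.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 400  # hypothetical sample size

# Synthetic predictors: customer orientation (co) and
# competitive differentiation of the market (diff) -- illustrative only.
co = rng.normal(size=n)
diff = rng.normal(size=n)

# Simulated outcome (brand love) with a negative interaction:
# customer orientation matters most when differentiation is low.
y = 0.5 * co - 0.3 * diff - 0.25 * co * diff + rng.normal(scale=0.1, size=n)

# OLS with intercept, main effects, and the interaction term.
X = np.column_stack([np.ones(n), co, diff, co * diff])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# beta = [intercept, co, diff, co x diff]; a clearly negative
# interaction coefficient supports the moderation hypothesis.
print(beta)
```

The study itself tests the hypothesis with structural equation modeling; the interaction term above is only the simplest regression form of the same moderation idea.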
This article examines the fundamental options for integrating social media functionality into companies. Building on this, social commerce is derived as a central concern of corporate management. The focus is on the customer's buying process and its interfaces to the communication instruments of the social web. The article shows how social media influence the individual buying process. These dynamics then form the basis for describing possible strategic fields of application and areas of social commerce in corporate management.
Customer services in the digital transformation: social media versus hotline channel performance
(2015)
Due to the digital transformation, online service strategies have gained prominence in the practice as well as the theory of service management. This study examines the efficacy of different types of service channels in customer complaint handling. The theoretical framework, developed from the complaint handling and social media literature, is tested against data collected from two different channels (hotline and social media) of a German telecommunication service provider. We contribute to the understanding of firms' multichannel distribution strategies in two ways: a) by conceptualizing and evaluating complaint handling quality across traditional and social media channels, and b) by testing the impact of complaint handling quality on key performance outcomes like customer loyalty, positive word-of-mouth, and cross-purchase intentions.
The stimulation of user engagement has received significant attention in extant research. However, the theory of antecedents for user engagement with an initial electronic word-of-mouth (eWoM) communication is relatively less developed. In an investigation of 576 unique user postings across independent Facebook (FB) communities for two German firms, we contribute to the extant knowledge on user engagement in two different ways. First, we explicate senders’ prior usage experience and the extent of their acquaintance with other community members as the two key drivers of user engagement across a product and a service community. Second, we reveal that these main effects differ according to the type of community. In service communities, experience has a stronger impact on user engagement; whereas, in product communities, acquaintance is more important.
In recent years, the digital transformation has gained significant importance in Business-to-Business (B2B) research. Social media applications provide executives with a raft of new options. Consequently, interfaces to social media platforms have also been integrated into B2B salesforce applications, although very little is as yet known about their usage and general impact on B2B sales performance. This paper evaluates 1) the conceptualization of social media usage in a dyadic B2B relationship; 2) the effects of a more differentiated usage construct on customer satisfaction; 3) antecedents of social media usage on multiple levels; and 4) the effectiveness of social media usage for different types of customers. The framework presented here is tested cross-industry against data collected from dyadic buyer-seller relationships in the IT service industry. The results elucidate the preconditions and the impact of social media usage strategies in B2B sales relations.
Enterprise Social Networks : Einführung in die Thematik und Ableitung relevanter Forschungsfelder
(2016)
The relevance of enterprise social networks (ESN) for everyday work in knowledge organizations is growing. These networks support communication, collaboration, and knowledge management in companies. This article provides an introduction to the topic of ESN and outlines possible uses, potentials, and challenges. It gives an overview of key articles that survey research in the field of ESN. Individual research contributions are then analyzed and further research potential is derived. This leads to eight promising areas for further research: 1) user behavior, 2) effects of ESN use, 3) management, leadership, and governance for ESN, 4) value determination and success measurement, 5) cultural effects, 6) architecture and design of ESN, 7) theories, research designs, and methods, and 8) further challenges relating to ESN. The article characterizes these areas and formulates exemplary open questions for future research.
Many companies have recently been looking into the use of social media for internal communication and collaboration. So-called enterprise social networks offer integrated platforms with profiles, blogs, group and comment functions for internal corporate use. Very often, substantial investments are involved. The budgets are spent mainly on IT; "soft factors" are frequently left out. A serious mistake, as recent market studies show: quite a few of these ambitious projects are therefore in danger of failing.
Companies have recently been focusing more on the use of social media in internal communication and collaboration. So-called enterprise social networks (ESN) offer integrated platforms with profiles, blogs, shared document management, wikis, chats, group and comment functions for internal corporate use. Very often, substantial investments are involved. The budgets are spent mainly on IT, while "soft factors" are frequently left out. This can lead to considerable problems with the acceptance of such platforms. Further measures are therefore required to steer the introduction and operation of ESN, which can be summarized under the term governance. The governance construct refers to the nature and scope of the roles and tasks for steering the use of ESN. This article examines possible governance models for the introduction and further development of ESN. The results of this research were obtained on the basis of a thorough literature analysis and an exploratory survey of executives responsible for the use of ESN in large German companies. The implications of the qualitative data analysis point to relationships that can serve as initial hypotheses for further research.
Social media usage in business-to-business sales : conceptualization, antecedents, and outcomes
(2015)
In recent years, the rise of social media has gained significant importance in marketing research. Social media applications now provide executives with a raft of new options. Consequently, interfaces to social media platforms have also been integrated into Business-to-Business (B2B) salesforce applications, although very little is as yet known about their usage and general impact on B2B sales performance. This paper evaluates 1) the conceptualization of social media usage in a dyadic B2B relationship; 2) the effects of a more differentiated usage construct on customer satisfaction; 3) antecedents of social media usage on multiple levels; and 4) the effectiveness of social media usage for different types of customers. The framework presented here is tested cross-industry against data collected from dyadic buyer-seller relationships in the IT service industry. The results elucidate the preconditions and the impact of social media usage strategies in B2B sales relations.
Digital transformation refers to the increasing digitization of content and processes and the growing importance of digital media in business and society. This change is driven, among other things, by the evolution in how the Internet is used. While the so-called Web 1.0 phase focused on publishing and distributing static content, Web 2.0 predominantly stimulates processes of decentralized creation and easy distribution of user-generated content. Companies must respond to these changes in order to secure their competitiveness in the long term. This article concentrates on the further development of customer service. In past years, many companies regarded customer service mainly as a cost factor of little strategic importance. This view has changed fundamentally in the digital transformation. Today, customers can address defects in products and services immediately and with great reach through forums and social media channels. Companies must respond on the same channels in order to contain the multiplication of negative views and avoid spillover effects into traditional media. At the same time, digital channels give rise to entirely new service offerings that have a lasting impact on corporate competitiveness. This article first provides an overview of the main lines of development in the digital transformation. On this basis, it outlines the perspectives for companies to integrate digital media into their own value chain. In addition, the change in customer service in the so-called Web 2.0 is discussed. An outlook on future developments in digitization rounds off the article.
This paper investigates the impact of dynamic capabilities (DC) on brand love. From a resource-based view, there is little clarity vis-à-vis the specific capabilities that drive the ability to create brand love. This paper focuses on three research questions: Firstly, which dynamic capabilities are relevant for brand love? Secondly, how strong is the impact of certain dynamic capabilities on brand love? Thirdly, which conditions mediate and moderate the impact of specific dynamic capabilities on brand love? Data from a multi-method research approach were used to identify the specific capabilities that corporations need to enhance brand love. Furthermore, a standardized online survey was conducted among marketing executives and evaluated by structural equation modeling. The results indicate that customer expertise plays a major role in the relationship between dynamic capabilities and brand love. Furthermore, this relationship is more important in markets that have a low competitive differentiation in products and services.
Electronic word-of-mouth (eWoM) communication has received a lot of attention from the academic community. As multiple research papers focus on specific facets of eWoM, there is a need to integrate current research results systematically. Thus, this paper presents a scientific literature analysis in order to determine the current state-of-the-art in the field of eWoM.
This paper examines the efficacy of social media systems in customer complaint handling. The emergence of social media as a useful complement and (possibly) a viable alternative to the traditional channels of service delivery motivates this research. The theoretical framework, developed from the literature on social media and complaint handling, is tested against data collected from two different channels (hotline and social media) of a German telecommunication services provider, in order to gain insights into channel efficacy in complaint handling. We contribute to the understanding of firms' technology usage for complaint handling in two ways:
(a) by conceptualizing and evaluating complaint handling quality across traditional and social media channels and (b) by comparing the impact of complaint handling quality on key performance outcomes such as customer loyalty, positive word-of-mouth, and cross-purchase intentions across traditional and social media channels.
The advent of chatbots in customer service solutions has received increasing attention from research and practice in recent years. However, the relevant dimensions and features of service quality and service performance for chatbots remain quite unclear. This research therefore develops and tests a conceptual model for customer service quality and customer service performance in the context of chatbots. Additionally, the impact of the developed service dimensions on different customer relationship metrics is measured across different service channels (hotline versus chatbots). Findings of six independent studies indicate a strong main effect of the conceptualized service dimensions on customer satisfaction, service costs, intention to reuse the service, word-of-mouth, and customer loyalty. However, different service dimensions are relevant for chatbots than for a traditional service hotline.
The aim of this work is to examine the security of the infrastructure of modern vehicle-to-vehicle communication. To this end, the security standards for radio communication are described in detail and then tested against possible attack models. With this knowledge of the VANET architecture, various attacks become easier to understand. This exposes the vulnerabilities and illustrates countermeasures at suitable points in the architecture.
Many companies practice performance management as a heterogeneous, historically grown mix of numerous separate decisions, instruments, processes, and systems rather than as a strategically and systematically planned management system. Because this style of performance management is inefficient, a holistic and integrated approach is a key factor. Performance management must be able to meet central objectives and requirements and lay the groundwork for long-term corporate success. This article presents a central approach to the conception of holistic, long-term performance management. Five equally weighted sub-disciplines are illustrated; through their characteristics and combination, they demonstrate the thematic and structural complexity of performance management. The objective of this article is to present and communicate performance management and its context through an easily comprehensible system, without prescribing a one-size-fits-all recipe.
Management nowadays is confronted with a variety of information originating from internal and external sources. This makes it increasingly difficult to focus on the relevant, business-critical key figures. In practice, information management is often a major weakness of efficient corporate management, caused by the lack of a centralized, categorized, and summarized presentation and analysis of strategy- and decision-relevant information. Management cockpits, a kind of information center for managers, are one approach to meeting the challenges of information management. They provide a specific work environment in which decision-makers get a quick and simple overview of the company's economic situation. In fully equipped premises, the entire process is supported, from acquiring information to analysis, decision-making, and communication. With management cockpits, a cross-functional, KPI-based, and strategy-oriented controlling and management process can be successfully established in companies, and the work of interdisciplinary management teams can be supported. To this end, the management cockpit is equipped with a range of functionalities that allow information to be structured, categorized, and visualized in a management-adequate way, along with extensive analysis and simulation options. As a communication and collaboration platform, management cockpits are a starting point and a valuable companion on the way to holistic and sustainable performance management.
This article presents a central framework for the conception of holistic, long-term performance management. It explains five equally weighted sub-disciplines which, in their characteristics and combination, reveal the thematic and design complexity of performance management. The aim is to make the complex topic of performance management and its interrelationships comprehensible and communicable through an easily understandable system, without providing a universally valid recipe.
For financial reasons, SMEs often do not see themselves in a position to invest in fundamental Industrie 4.0 technologies. The main reservation cited is a supposedly poor cost-benefit ratio and long payback cycles. The current challenges tend to lie in ever-advancing internationalization and the increasing pressure to innovate caused by competition. It is, of course, well known that the growing interconnection of production facilities in Industrie 4.0 also entails risks for IT and data security. Problems with data quality, stability, interfaces, and legal issues also contribute to companies' uncertainty. Given the ever-increasing interconnection between companies and stakeholders, suppliers in particular must see it as their duty to take up and engage with Industrie 4.0. These companies in particular must realize that only by using suitable information and communication technologies will they remain able to be part of the value chain between their customers and suppliers.
- Increasing digitalization confronts companies and their employees with many new challenges while at the same time offering great potential.
- The chance of realizing the potential associated with digitalization depends on how consistently digitalization is implemented.
- A prerequisite for successful implementation is a digitalization strategy that makes the path and the goals transparent, comprehensible, and worth striving for for everyone involved.
- A digitalization strategy that is lived and supported by everyone implies a cultural change in the company.
Digital transformation is the trigger for questioning and further developing existing production paradigms. It offers manufacturing companies the opportunity to fundamentally optimize their value creation and tap new business potential.
Within the framework of Industrie 4.0, current information and communication technologies are combined with production and automation technology, aiming at a new level of organization and control of the entire value chain across the complete life cycle of products and services. The goal is a significant increase in the flexibility and performance of value creation as well as the individualization of products and services through intensive customer-company interaction and networking.
A holistic approach to digitization enables decision-makers to achieve new efficiency in corporate performance management. Digitalization improves the quality, validity, and speed of information retrieval and processing. At present, most corporations are confronted with the problem of not being able to organize, categorize, and visualize decision-relevant information. To meet the challenges of information management, the Management Cockpit provides an information center for managers. Tailored to the specific working environment of executives, the Management Cockpit offers a quick and comprehensive overview of the company's situation. Today, a company's situation is no longer influenced only by internal factors but also by its public image. Social media monitoring and analysis is therefore a crucial component for the external factors of successful management. Real-time monitoring of the emotions and behaviors of consumers and customers thus contributes to effective controlling of all business areas. Smart factories promise to collect data for the internal factors, but the current reality in manufacturing looks different: production often consists of a large number of different machines with varying degrees of digitization and limited sensor data availability. To close this gap, we developed a compact sensor board with network components, which allows a flexible design with different sensors for a wide variety of applications. The sensor data enable decision-makers to adapt the supply chain based on their internal and external observations in the Management Cockpit. Thanks to its real-time and long-term monitoring and analysis capabilities, the Management Cockpit provides a multi-dimensional view of the company and supports holistic corporate performance management.
There is no single, all-encompassing management method for holistic performance management. What matters instead is the interplay of all success-critical management disciplines within an integrative management system, in which all actors and stakeholders pull together in a coordinated way even with differing focuses and perspectives. It is critical to success, however, that a company-specific adaptation is planned, composed, and interlocked against a holistic background of experience. Management cockpits can make a valuable contribution as a staged solution: as an integration layer, they generate transparency and a communication platform for holistic performance management, even if the full functional, methodological, procedural, and technical integration has not yet been completely achieved.
In today's global competition, companies need effective and efficient performance management to secure their success in the long term. Such holistic, long-term performance management can only meet expectations if all success-critical management disciplines are optimally coordinated within an integrative management system.
This article shows which fundamental management methods and instruments can be identified to explain the difference between consistently successful and unsuccessful companies. In this context, an approach for an overall performance management process is developed, in which the central sources of problems in the introduction of performance management are classified and explained.
Industrie 4.0 - Outlook
(2016)
For companies, it is important to set the strategic course for their Industrie 4.0 direction early on and to build up experience in dealing with Industrie 4.0 technologies. However, some of the technologies relevant to Industrie 4.0 are not expected to fully exploit their efficiency potential for another 5 to 10 years. The introduction of Industrie 4.0 affects nearly all areas of a company and must therefore be understood, planned and actively managed not only as a digital transformation but also as a cultural change within the organization. Topics such as data protection and IT security are not only important prerequisites for a successful Industrie 4.0 introduction; they must also be consistently and thoroughly anchored in the digital systems as essential acceptance and success factors.
This work is set within the broad context of Smart Cities and focuses on the area of intelligent vehicle driving, in both urban and interurban zones, through the collection of real-time data measured with sensors by the drivers themselves, as well as data captured through simulation.
The objective of this work is twofold. On the one hand, the study and application of different techniques and methods for detecting outliers in multivariate databases, together with a comparison between them through tests carried out with real traffic data. On the other hand, to establish a relationship between anomalous traffic situations, such as traffic jams or accidents, and the multivariate outliers found.
Outlier detection represents one of the most important tasks in any data analysis, whatever the domain or area of study, since among its primary functions is the discovery of useful, highly valuable information that usually remains hidden by the high dimensionality of the data.
By using outlier detection mechanisms together with supervised classification methods, it becomes possible to recognize elements of the urban road infrastructure such as roundabouts, zebra crossings, intersections or traffic lights.
Checklists are a valuable tool to ensure process quality and quality of care. To ensure proper integration in clinical processes, it would be desirable to generate checklists directly from formal process descriptions. Those checklists could also be used for user interaction in context-aware surgical assist systems. We built a tool to automatically convert Business Process Model and Notation (BPMN) process models to checklists displayed as HTML websites. Gateways representing decisions are mapped to checklist items that trigger dynamic content loading based on the placed checkmark. The usability of the resulting system was positively evaluated regarding comprehensibility and end-user friendliness.
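The gateway-to-checklist mapping described above could look roughly like the following sketch. The node structure and field names are illustrative assumptions, not the actual tool's data model:

```python
# Minimal sketch of converting a (simplified) BPMN model to checklist items.
# Tasks become plain items; exclusive gateways become decision items whose
# branches are loaded dynamically once the user places a checkmark.

def bpmn_to_checklist(nodes):
    """Convert a list of simplified BPMN nodes into checklist items."""
    items = []
    for node in nodes:
        if node["type"] == "task":
            items.append({"label": node["name"], "kind": "check"})
        elif node["type"] == "exclusiveGateway":
            items.append({
                "label": node["name"],
                "kind": "decision",
                # each branch names the sub-checklist to load on selection
                "options": [flow["name"] for flow in node["outgoing"]],
            })
    return items

model = [
    {"type": "task", "name": "Verify patient identity"},
    {"type": "exclusiveGateway", "name": "Implant available?",
     "outgoing": [{"name": "yes"}, {"name": "no"}]},
]
checklist = bpmn_to_checklist(model)
```

In the real system the checklist would then be rendered as an HTML page, with each decision option triggering dynamic content loading.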
Purpose
Supporting the surgeon during surgery is one of the main goals of intelligent ORs. The OR-Pad project aims to optimize the information flow within the perioperative area. A shared information space should enable appropriate preparation and provision of relevant information at any time before, during, and after surgery.
Methods
Based on previous work on an interaction concept and system architecture for the sterile OR-Pad system, we designed a user interface for mobile and intraoperative (stationary) use, focusing on the most important functionalities like clear information provision to reduce information overload. The concepts were transferred into a high-fidelity prototype for demonstration purposes. The prototype was evaluated from different perspectives, including a usability study.
Results
The prototype’s central element is a timeline displaying all available case information chronologically, such as radiological images, laboratory findings, or notes. This information space can be adapted for individual purposes (e.g., highlighting a tumor, filtering for own material). With the mobile and intraoperative mode of the system, relevant information can be added, preselected, viewed, and extended during the perioperative process. Overall, the evaluation showed good results and confirmed the vision of the information system.
Conclusion
The high-fidelity prototype of the information system OR-Pad focuses on supporting the surgeon via a timeline making all available case information accessible before, during, and after surgery. The information space can be personalized to enable targeted support. Further development is reasonable to optimize the approach and address missing or insufficient aspects, like the holding arm and sterility concept or new desired features.
This work focuses on supporting stent graft selection in the endovascular repair of infrarenal aortic aneurysms. A method for evaluating the results of a finite element analysis of stent graft behavior was designed, implemented, and discussed in a Germany-wide user study with 16 surgeons. The developed human-machine interface enables the vascular physician to interactively analyze computed fixation forces and contact states of several stent grafts in the context of the aortic section to be treated. The method allows physicians to engage more deeply with numerical simulations and stent graft assessment measures. In the user study, this made it possible to determine the application potential of numerical simulations for supporting stent graft selection and to define a requirements specification for a system for simulation-based stent graft planning. As a result, the main application potential was identified as determining a minimum degree of oversizing, optimizing the limb length of bifurcated stent grafts, and comparing different stent graft designs. The essential functions of a system for simulation-based stent graft selection include an overview map of color-coded migration risk per stent graft and landing zone, the visualization of the sealing state of the stent components, and the display of stent graft and vessel deformations in the 3D model.
Stent graft visualization and planning tool for endovascular surgery using finite element analysis
(2014)
Purpose: A new approach to optimizing stent graft selection for endovascular aortic repair is the use of finite element analysis. Once the finite element model is created and solved, a software module is needed to view the simulation results in the clinical work environment. A new tool for the interpretation of simulation results, named Medical Postprocessor, that enables comparison of different stent graft configurations and products was designed, implemented and tested.
Methods: Aortic endovascular stent graft ring forces and sealing states in the vessel landing zone of three different configurations were provided in a surgical planning software using the Medical Imaging Interaction Toolkit (MITK) software system. For data interpretation, software modules for 2D and 3D presentations were implemented. Ten surgeons evaluated the software features of the Medical Postprocessor. These surgeons performed usability tests and answered questionnaires based on their experience with the system.
Results: The Medical Postprocessor visualization system enabled vascular surgeons to determine the configuration with the highest overall fixation force in 16 ± 6 s, the best proximal sealing in 56 ± 24 s, and the highest proximal fixation force in 38 ± 12 s. The majority considered the multiformat data provided helpful and found the Medical Postprocessor to be an efficient decision support system for stent graft selection. The evaluation of the user interface yielded an ISONORM-conformant user interface (113.5 points).
Conclusion: The Medical Postprocessor visualization software tool for analyzing stent graft properties was evaluated by vascular surgeons. The results show that the software can assist in the interpretation of simulation results to optimize stent graft configuration and sizing.
Going forward with the requirements of missions to the Moon and further into deep space, the European Space Agency is investigating new methods of astronaut training that can help accelerate learning, increase availability, and reduce complexity and cost compared to currently used methods. To achieve this, technologies such as virtual reality may be utilized. This paper investigates the benefits of using virtual reality for extravehicular activity training in comparison to conventional training methods, such as neutral buoyancy pools. To determine the requirements and current uses of virtual reality for extravehicular activity training, first-hand tests of currently available software as well as expert interviews are employed. With this knowledge, a concept is developed that may be used to further advance training methods in virtual reality. The resulting concept is used as a basis for the development of a prototype to showcase user interactions and locomotion in microgravity simulations.
Digitalization of products and services commonly causes substantial changes in business models, operations, organization structures and IT infrastructures of enterprises. Motivated by experiences and observations from digitalization projects, the paper investigates the effects of digitalization on enterprise architectures (EA). EA models serve as representation of business, information system and technical aspects of an enterprise to support management and development. By comparing EA models before and after digitalization, the paper analyzes the kinds of changes visible in the EA model. The most important finding is that newly created digitized products and the associated (product)- and enterprise architecture are no longer properly integrated into the overall architecture and even exist in parallel. Thus, the focus of this work is on showing these parallel architectures and proposing derivations for a better integration.
Study programs in higher education have to reflect important societal and industrial challenges to prepare the next generations of professionals for future tasks. The focus of this paper is the challenge of digitalization and digital transformation. The paper proposes the IS education profile of a Digital Business Architect (DBA). The study program emphasizes design thinking, model centricity, and capability thinking as a response to domain requirements from digital transformation and educational system and structure requirements. Experiences in implementing the DBA include the need for integrating deductive and inductive teaching, a strong basis in real-world cases, and collaborative learning approaches to develop adequate competences in business model management, enterprise modeling, enterprise architecture management, and capability management.
Digital technologies are main strategic drivers for digitalization and offer ubiquitous data availability, unlimited connectivity, and massive processing power for a fundamentally changing business. This leads to the development and application of intelligent digital systems. The current state of research and practice of architecting digital systems and services lacks a solid methodological foundation that fully accommodates all requirements linked to efficient and effective development of digital systems in organizations. Research presented in this paper addresses the question, how management of complexity in digital systems and architectures can be supported from a methodological perspective. In this context, the current focus is on a better understanding of the causes of increased complexity and requirements to methodological support. For this purpose, we take an enterprise architecture perspective, i.e. how the introduction of digital systems affects the complexity of EA. Two industrial case studies and a systematic literature analysis result in the proposal of an extended Digital Enterprise Architecture Cube as framework for future methodical support.
Context:
Test-driven development (TDD) is an agile software development approach that has been widely claimed to improve software quality. However, the extent to which TDD improves quality appears to be largely dependent upon the characteristics of the study in which it is evaluated (e.g., the research method, participant type, programming environment, etc.). The particularities of each study make the aggregation of results untenable.
Objectives:
The goal of this paper is to: increase the accuracy and generalizability of the results achieved in isolated experiments on TDD, provide joint conclusions on the performance of TDD across different industrial and academic settings, and assess the extent to which the characteristics of the experiments affect the quality-related performance of TDD.
Method:
We conduct a family of 12 experiments on TDD in academia and industry. We aggregate their results by means of meta-analysis. We perform exploratory analyses to identify variables impacting the quality-related performance of TDD.
Results:
TDD novices achieve a slightly higher code quality with iterative test-last development (i.e., ITL, the reverse approach of TDD) than with TDD. The task being developed largely determines quality. The programming environment, the order in which TDD and ITL are applied, or the learning effects from one development approach to another do not appear to affect quality. The quality-related performance of professionals using TDD drops more than for students. We hypothesize that this may be due to their being more resistant to change and potentially less motivated than students.
Conclusion:
Previous studies seem to provide conflicting results on TDD performance (i.e., positive vs. negative, respectively). We hypothesize that these conflicting results may be due to different study durations, experiment participants being unfamiliar with the TDD process, or case studies comparing the performance achieved by TDD vs. the control approach (e.g., the waterfall model), each applied to develop a different system. Further experiments with TDD experts are needed to validate these hypotheses.
In summary, we believe that current “sleep monitoring” consumer devices on the market must undergo a more robust validation process before being made available and distributed in the general public. This is especially noteworthy as there have been first reports in the literature that inaccurate feedback of such consumer devices can worry subjects and may even lead to compromised well-being of the user.
Data collected from internet applications are mainly stored in the form of transactions. All transactions of one user form a sequence, which shows the user's behaviour on the site. Nowadays, it is important to be able to classify this behaviour in real time for various reasons: e.g., to increase the conversion rate of customers while they are in the store, or to prevent fraudulent transactions before they are placed. However, this is difficult due to the complex structure of the data sequences (i.e., a mix of categorical and continuous data types, constant data updates) and the large amounts of data that are stored. Therefore, this thesis studies the classification of complex data sequences. It surveys the fields of time series analysis (temporal data mining), sequence data mining, and standard classification algorithms. It turns out that these algorithms are either difficult to apply to data sequences or do not deliver a classification: time series need a predefined model and are not able to handle complex data types; sequence classification algorithms such as the apriori algorithm family are not able to utilize the time aspect of the data. The strengths and weaknesses of the candidate algorithms are identified and used to build a new approach to solve the problem of classifying complex data sequences. The problem is solved by a two-step process. First, feature construction is used to create and discover suitable features in a training phase. Then, the blueprints of the discovered features are used in a formula during the classification phase to perform the real-time classification. The features are constructed by combining and aggregating the original data over the span of the sequence, including the elapsed time, by using a calculated time axis. Additionally, a combination of features and feature selection are used to simplify complex data types. This allows catching behavioural patterns that occur in the course of time.
This newly proposed approach combines techniques from several research fields. Part of the algorithm originates from the field of feature construction and is used to reveal behaviour over time and to express this behaviour in the form of features. A combination of the features is used to highlight relations between them. The blueprints of these features can then be used to achieve classification in real time on an incoming data stream. An automated framework is presented that allows the features to adapt iteratively to a change in the underlying patterns of the data stream. This core feature of the presented work is achieved by separating the feature application step from the computationally costly feature construction step and by iteratively restarting the feature construction step on the new incoming data. The algorithm and the corresponding models are described in detail and applied to three case studies (customer churn prediction, bot detection in computer games, credit card fraud detection). The case studies show that the proposed algorithm is able to find distinctive information in data sequences and use it effectively for classification tasks. The promising results indicate that the suggested approach can be applied to a wide range of other application areas that incorporate data sequences.
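The aggregation step described above, turning a variable-length transaction sequence into a flat feature vector that includes the elapsed time, can be sketched as follows. The transaction layout and feature names are assumptions chosen for illustration, not the thesis's exact feature set:

```python
# Sketch of the first step: aggregate a user's transaction sequence into a
# flat feature vector. Each transaction is (timestamp_seconds, amount,
# category). The resulting "blueprint" (which aggregates to compute) can
# then be reused on an incoming stream for real-time classification.

def build_features(transactions):
    """Aggregate one data sequence into features, including elapsed time."""
    if not transactions:
        return {"n": 0, "total": 0.0, "span_s": 0.0, "n_categories": 0}
    times = [t for t, _, _ in transactions]
    amounts = [a for _, a, _ in transactions]
    categories = {c for _, _, c in transactions}
    return {
        "n": len(transactions),             # sequence length
        "total": sum(amounts),              # aggregated amount
        "span_s": max(times) - min(times),  # elapsed time across the sequence
        "n_categories": len(categories),    # variety of behaviour
    }

seq = [(0, 9.99, "game"), (60, 4.99, "game"), (3600, 49.99, "giftcard")]
features = build_features(seq)
```

A standard classifier (decision tree, naive Bayes, etc.) can then be trained on such vectors instead of on the raw sequences.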
When forecasting sales figures, not only the sales history but also the future price of a product will influence the sales quantity. At first sight, multivariate time series seem to be the appropriate model for this task. Nonetheless, in real life history is not always repeatable, i.e. in the case of sales history there is only one price for a product at a given time. This complicates the design of a multivariate time series. However, for some seasonal or perishable products the price is rather a function of the expiration date than of the sales history. This additional information can help to design a more accurate and causal time series model. The proposed solution uses a univariate time series model but takes the price of a product as a parameter that systematically influences the prediction. The price influence is computed from historical sales data using correlation analysis and adjustable price ranges to identify products with comparable history. Compared to other techniques, this novel approach is easy to compute and allows presetting the price parameter for predictions and simulations. Tests with data from the Data Mining Cup 2012 demonstrate better results than established sophisticated time series methods.
Online credit card fraud presents a significant challenge in the field of eCommerce. In 2012 alone, the total loss due to credit card fraud in the US amounted to $54 billion. Especially online games merchants have difficulties applying standard fraud detection algorithms to achieve timely and accurate detection. This paper describes the special constraints of this domain and highlights the reasons why conventional algorithms are not quite effective at dealing with this problem. Our suggested solution to the problem originates from the field of feature construction joined with the field of temporal sequence data mining. We present feature construction techniques that are able to create discriminative features based on a sequence of transactions and to incorporate time into the classification process. In addition, a framework is presented that allows for an automated and adaptive change of features in case the underlying pattern is changing.
The recent years and especially the Internet have changed the ways in which data is stored. It is now common to store data in the form of transactions, together with its creation time-stamp. These transactions can often be attributed to logical units, e.g., all transactions that belong to one customer. These groups, which we refer to as data sequences, have a more complex structure than tuple-based data. This makes it more difficult to find discriminatory patterns for classification purposes. However, the complex structure potentially enables us to track behaviour and its change over the course of time. This is quite interesting, especially in the e-commerce area, in which the classification of a sequence of customer actions is still a challenging task for data miners. However, before standard algorithms such as Decision Trees, Neural Nets, Naive Bayes or Bayesian Belief Networks can be applied to sequential data, preparations are required in order to capture the information stored within the sequences. Therefore, this work presents a systematic approach on how to reveal sequence patterns among data and how to construct powerful features out of the primitive sequence attributes. This is achieved by sequence aggregation and the incorporation of the time dimension into the feature construction step. The proposed algorithm is described in detail and applied to a real-life data set, which demonstrates the ability of the proposed algorithm to boost the classification performance of well-known data mining algorithms for binary classification tasks.
When forecasting sales figures, not only the sales history but also the future price of a product will influence the sales quantity. At first sight, multivariate time series seem to be the appropriate model for this task. Nonetheless, in real life history is not always repeatable, i.e., in the case of sales history there is only one price for a product at a given time. This complicates the design of a multivariate time series. However, for some seasonal or perishable products the price is rather a function of the expiration date than of the sales history. This additional information can help to design a more accurate and causal time series model. The proposed solution uses a univariate time series model but takes the price of a product as a parameter that systematically influences the prediction based on a calculated periodicity. The price influence is computed from historical sales data using correlation analysis and adjustable price ranges to identify products with comparable history. The periodicity is calculated using a novel approach based on data folding and Pearson correlation. Compared to other techniques, this approach is easy to compute and allows presetting the price parameter for predictions and simulations. Tests with data from the Data Mining Cup 2012 as well as artificial data demonstrate better results than established sophisticated time series methods.
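The idea of detecting periodicity by data folding plus Pearson correlation can be sketched as follows. This is an assumed reading of the approach named above, not the authors' exact algorithm: fold the series at each candidate period and score how well consecutive folds correlate.

```python
# Sketch: periodicity detection via data folding and Pearson correlation.
# A series folded at its true period yields folds that correlate strongly.

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy) if sx and sy else 0.0

def best_period(series, candidates):
    """Fold the series at each candidate period; score consecutive folds."""
    scores = {}
    for p in candidates:
        folds = [series[i:i + p] for i in range(0, len(series) - p + 1, p)]
        pairs = [pearson(folds[k], folds[k + 1]) for k in range(len(folds) - 1)]
        scores[p] = sum(pairs) / len(pairs) if pairs else 0.0
    return max(scores, key=scores.get)

weekly = [10, 12, 15, 14, 13, 30, 28] * 4   # toy sales with weekend peaks
```

For this toy series, `best_period(weekly, [5, 6, 7])` picks 7, since folds of length 7 line up exactly.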
The recent years and especially the Internet have changed the way data is stored. We now often store data together with its creation time-stamp. These data sequences potentially enable us to track the change of data over time. This is quite interesting, especially in the e-commerce area, in which the classification of a sequence of customer actions is still a challenging task for data miners. However, before standard algorithms such as Decision Trees, Neural Nets, Naive Bayes or Bayesian Belief Networks can be applied to sequential data, preparations need to be done in order to capture the information stored within the sequences. Therefore, this work presents a systematic approach on how to reveal sequence patterns among data and how to construct powerful features out of the primitive sequence attributes. This is achieved by sequence aggregation and the incorporation of the time dimension into the feature construction step. The proposed algorithm is described in detail and applied to a real-life data set, which demonstrates the ability of the proposed algorithm to boost the classification performance of well-known data mining algorithms for classification tasks.
A sequence of transactions represents a complex and multidimensional type of data. Feature construction can be used to reduce the data's dimensionality and to find behavioural patterns within such sequences. The patterns can be expressed using the blueprints of the constructed relevant features. These blueprints can then be used for real-time classification of other sequences.
The relevance of Robotic Process Automation (RPA) has increased over the last few years. Combining RPA with Artificial Intelligence (AI) can further enhance the business value of the technology. The aim of this research was to analyze applications, terminology, benefits, and challenges of combining the two technologies. A total of 60 articles were analyzed in a systematic literature review to evaluate the aforementioned areas. The results show that by adding AI, RPA applications can be used in more complex contexts, the human factor during the development process can be minimized, and AI-based decision-making can be integrated into RPA routines. This paper also presents a current overview of the terminology in use. Moreover, it shows that integrating AI can give rise to previously unseen challenges in RPA projects, but also brings many new benefits. Based on the outcome, it is concluded that the topic offers a lot of potential, but further research and development is required. The results of this study help researchers gain an overview of the state of the art in combining RPA and AI.
Introduction: Even though there is a standard procedure for CI surgery, surgical steps often differ individually, especially in pediatric surgery, due to anatomical variations, malformations or unforeseen events. This is why every surgical report has to be created individually, which takes time and relies on the surgeon's correct memory. A standardized recording of intraoperative data, with subsequent storage and text processing, would therefore be desirable and would provide the basis for subsequent data processing, e.g. in the context of research or quality assurance.
Method: In cooperation with Reutlingen University, we conducted a workflow analysis of the prototype of a semi-automatic checklist tool. Based on checklists automatically generated from BPMN models, a prototype user interface was developed for an Android tablet. Functions such as uploading photos and files, manual user entries, the interception of foreseeable deviations from the normal course of surgery, and the automatic creation of surgery documentation were implemented. The system was tested in a remote usability test on a petrous bone model.
Result: The user interface allows simple, intuitive handling that can be implemented well in the intraoperative setting. Clinical data as well as surgical steps could be individually recorded and saved via DICOM. An automatic surgery report could be created and saved.
Summary: The use of a dynamic checklist tool facilitates the capture, storage and processing of surgical data. Further applications in clinical practice are pending.
Transforming our food system is important to achieving global climate neutrality and food security. Germany has set a national target of reaching a 30% share in organic farming to support the goal. When looking at the transformation process from conventional to organic farming, it becomes apparent that measures need to be taken to reach this anticipated goal. A particular emphasis of this work is placed on finding a digital solution and process improvements to ensure longevity and efficiency. Interviews with actors along the farm-to-fork value chain were conducted to identify central barriers and drivers of organic transformation. The results of the interviews show firstly, that three subsystems need to be distinguished when talking about the farm-to-fork value chain: (1) farmers, (2) intermediaries, and (3) the canteen system. Although all three subsystems can be combined to form a coherent value chain, they rarely act and communicate beyond the boundaries of their subsystem. Secondly, we were able to allocate primary barriers and drivers to each of the subsystems, highlighting the need to include all three in the transformation process and aim for a comprehensive digital solution. This work explores the potential of a network-based platform to improve the current practice of rigid and strictly hierarchical value chains. We focus on deriving user requirements from the interviews to describe the necessary functionality of the platform to address the identified barriers and exploit existing drivers.
In recent years, traffic accidents between cars and bicycles or motorcycles have been increasing steadily. The cause is usually a carelessly opened vehicle door with which an approaching two-wheeler collides. These accidents can have devastating consequences for everyone involved. For this reason, the technological progress that runs through the entire automotive industry is to be extended by an additional component: a system for two-wheeler riders that detects an opening vehicle door early, can thus warn the rider in good time, and can initiate accident-avoidance strategies if necessary. This concept is intended to guide the development of the system and explain its fundamentals.
Cardiovascular diseases are directly or indirectly responsible for up to 38.5% of all deaths in Germany and thus represent the most frequent cause of death. At present, heart diseases are mainly discovered by chance during routine visits to the doctor or when acute symptoms occur. However, there is no practical method to proactively detect diseases or abnormalities of the heart in the daily environment and to take preventive measures for the person concerned. Long-term ECG devices, as currently used by physicians, are simply too expensive, impractical, and not widely available for everyday use. This work aims to develop an ECG device suitable for everyday use that can be worn directly on the body. For this purpose, an already existing hardware platform will be analyzed, and the corresponding potential for improvement will be identified. A precise picture of the existing data quality is obtained by metrological examination, and corresponding requirements are defined. Based on these identified optimization potentials, a new ECG device is developed. The revised ECG device is characterized by a high integration density and combines all components directly on one board except the battery and the ECG electrodes. The compact design allows the device to be attached directly to the chest. An integrated microcontroller allows digital signal processing without the need for an additional computer. Central features of the evaluation are a peak detection for detecting R-peaks and a calculation of the current heart rate based on the RR interval. To ensure the validity of the detected R-peaks, a model of the anatomical conditions is used. Thus, unrealistic RR-intervals can be excluded. The wireless interface allows continuous transmission of the calculated heart rate. Following the development of hardware and software, the results are verified, and appropriate conclusions about the data quality are drawn. 
As a result, a very compact and wearable ECG device with different wireless technologies, data storage, and evaluation of RR intervals was developed. Some tests yielded runtimes of up to 24 hours with wireless LAN activated and streaming.
In previous studies, we used a method for detecting stress that was based exclusively on heart rate and ECG to differentiate between situations such as mental stress, physical activity, relaxation, and rest. As a response of the heart to these situations, we observed different behavior in the Root Mean Square of Successive Differences (RMSSD) of heartbeats. This study aims to analyze virtual reality, delivered via a virtual reality headset, as an effective stressor for future work. The RMSSD is an important marker for the parasympathetic effect on the heart and can provide information about stress. For these measurements, the RR intervals were collected using a chest strap; additional sensors were not used for the analysis. We conducted experiments with ten subjects who had to drive a simulator for 25 minutes using monitors and 25 minutes using a virtual reality headset. Before starting and after finishing each simulation, the subjects completed a survey describing their mental state. The experiment results show that driving with a virtual reality headset has some influence on heart rate and RMSSD, but it does not significantly increase the stress of driving.
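RMSSD, the central metric of this study, is the square root of the mean of the squared successive differences of the RR intervals. A small sketch of the standard computation (not the authors' code):

```python
import math

def rmssd(rr_intervals_ms):
    """Root Mean Square of Successive Differences of RR intervals (ms).

    Computed as sqrt(mean((RR[i+1] - RR[i])^2)); higher values indicate
    stronger parasympathetic (beat-to-beat) heart rate variability.
    """
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))
```

For a strictly alternating series such as 800, 810, 800, 810 ms, every successive difference is 10 ms in magnitude, so the RMSSD is exactly 10 ms.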
While driving, stress is caused by situations in which drivers estimate their ability to manage the driving demands as insufficient or lose the capability to handle the situation. This leads to an increased number of driver mistakes and traffic violations. Additional stressors are time pressure, road conditions, or a dislike for driving. Stress therefore affects the driver and road safety. Stress is classified into two categories depending on its duration and its effects on body and psyche: short-term eustress and constantly present distress, which causes degenerative effects. In this work, we focus on distress. Wearable sensors are handy tools for collecting biosignals like heart rate and activity. Their easy installation and non-intrusive nature make them convenient for estimating stress. This study focuses on the investigation of stress and its implications. Specifically, it analyzes stress within a select group of individuals from both Spain and Germany. The primary objective is to examine the influence of recognized psychological factors, including personality traits such as neuroticism, extroversion, and psychoticism, on stress and road safety. Stress levels were estimated from physiological parameters (R-R intervals) collected with a Polar H10 chest strap. We observed that personality traits such as extroversion exhibited similar trends during relaxation, with an average heart rate 6% higher in Spain and 3% higher in Germany. However, while driving, introverts on average experienced more stress, with rates 4% and 1% lower than extroverts in Spain and Germany, respectively.
To evaluate the quality of sleep, it is important to determine how much time was spent in each sleep stage during the night. The gold standard in this domain is an overnight polysomnography (PSG). However, recording the necessary electrophysiological signals is extensive and complex, and the environment of the sleep laboratory, which is unfamiliar to the patient, might lead to distorted results. In this paper, a sleep stage detection algorithm is proposed that uses only the heart rate signal, derived from the electrocardiogram (ECG), as a discriminator. This would make it possible to perform sleep analysis at home, saving a lot of effort and money. From the heart rate, three parameters were calculated using the fast Fourier transform (FFT) in order to distinguish between the different sleep stages. ECG data along with a hypnogram scored by professionals was taken from the PhysioNet database, making it easy to compare the results. With an agreement rate of 41.3%, this approach is a good foundation for future research.
Stress is recognized as a predominant disease factor, and the costs for its treatment will increase in the future. The presented approach tries to detect stress in a very basic and easy-to-implement way, so that the cost of the device and the effort to wear it remain low. Users benefit from the fact that the system offers an easy interface reporting the status of their body in real time. In parallel, the system provides interfaces to pass the obtained data on for further processing and (professional) analyses, in case the user agrees. The system is designed to be used in everyday activities and is not restricted to laboratory use or environments. The implementation of the enhanced prototype shows that the detection of stress and the reporting can be managed using correlation plots and automatic pattern recognition even on a very lightweight microcontroller platform.
Stress is recognized as a predominant disease with rising costs for rehabilitation and treatment. Currently, several different approaches can be used for determining and calculating stress levels. The methods are usually divided into two categories. The first category does not require any special equipment for measuring stress; it uses the variations in behaviour patterns that occur under stress. The core disadvantage of this category is its limitation to specific use cases. The second category uses laboratory instruments and biological sensors. This category allows stress to be measured precisely and proficiently, but at the same time the instruments are neither mobile nor transportable and do not support real-time feedback. This work presents a mobile system that provides the calculation of stress. To achieve this, the data of a mobile ECG sensor is analysed, processed, and visualised on a mobile system such as a smartphone. This work also explains the stress measurement algorithm used. The result of this work is a portable system that can be used with a mobile device such as a smartphone as a visual interface for reporting the current stress level.
The investigation of stress requires distinguishing between stress caused by physical activity and stress caused by psychosocial factors. The behaviour of the heart in response to stress and physical activity is very similar if the set of monitored parameters is reduced to one. Currently, this differentiation remains difficult, and methods that only use the heart rate are not able to differentiate between stress and physical activity without additional sensor data input. The approach focuses on methods which generate signals providing characteristics that are useful for detecting stress, physical activity, no activity, and relaxation.
The purpose of this paper is to examine the effects of perceived stress on traffic and road safety. One of the leading causes of stress among drivers is the feeling of a lack of control during the driving process. Stress can result in more traffic accidents, an increase in driver errors, and an increase in traffic violations. To study this phenomenon, the Perceived Stress Questionnaire (PSQ) was used to evaluate the perceived stress while driving in a simulation. The study was conducted with participants from Germany, who were grouped into different categories based on their emotional stability. Each participant was monitored using wearable devices that measured their instantaneous heart rate (HR). Wearable devices were preferred because of their non-intrusive and portable nature. The results of this study provide an overview of how stress can affect traffic and road safety, which can be used for future research or to implement strategies to reduce road accidents and promote traffic safety.
Methods based exclusively on heart rate hardly allow differentiating between physical activity, stress, relaxation, and rest, which is why an additional sensor, such as an activity/movement sensor, is added for detection and classification. The response of the heart to physical activity, stress, relaxation, and no activity can be very similar. In this study, we observe the influence of induced stress and analyze which metrics could be considered for its detection. The changes in the Root Mean Square of Successive Differences (RMSSD) provide information about physiological changes. A set of measurements collecting the RR intervals was taken. The intervals are used as a parameter to distinguish four different stages. Parameters like skin conductivity or skin temperature were not used, because the main aim is to keep the number of sensors and devices to a minimum and thereby increase wearability in the future.
Segmentation of polyps in colonoscopy image data : a potential analysis of deep learning methods
(2018)
Colorectal carcinomas have a high mortality rate when detected late. However, early removal of the malignant polyps in the gastrointestinal tract that form their precursors offers high chances of survival. Yet especially small polyps are quite frequently overlooked during colonoscopies. Reliable image-processing systems that not only detect polyps in a colonoscopy frame but segment them with pixel accuracy could assist physicians in colorectal cancer screenings. This work analyzes the current state of polyp segmentation in the gastrointestinal tract and further investigates to what extent the recently very successful methods of deep learning offer advantages here.
This paper contributes to the automatic detection of perioperative workflow by developing a binary endoscope localization. Automated situation recognition in the context of an intelligent operating room requires the automatic conversion of low-level cues into more abstract high-level information. Imagery from a laparoscope delivers rich content that is easy to obtain but hard to process. We introduce a system which detects whether the endoscope's distal tip is inside or outside the patient based on the endoscope video. This information can be used as one parameter in a situation recognition pipeline. Our localization performs in real time at a video resolution of 1280x720, and 5-fold cross-validation yields mean F1-scores of up to 0.94 on videos of 7 laparoscopies.
Knowledge is an important resource, whose transfer is still not completely understood. The underlying belief of this thesis is that knowledge cannot be transferred directly from one person to another but must be converted for the transfer and therefore is subject to loss of knowledge and misunderstanding. This thesis proposes a new model for knowledge transfer and empirically evaluates this model. The model is based on the belief that knowledge must be encoded by the sender to transfer it to the receiver, who has to decode the message to obtain knowledge.
To prepare for the model, this thesis provides an overview of models for knowledge transfer and of factors that influence knowledge transfer. The proposed theoretical model for knowledge transfer is implemented in a prototype to demonstrate its applicability. The model describes the influence of four layers, namely the code, syntactic, semantic, and pragmatic layers, on the encoding and decoding of the message. The precise description of the influencing factors and of the overlapping knowledge of sender and receiver facilitates its implementation.
The application area chosen for the layered model for knowledge transfer is business process modelling. Business processes incorporate an important knowledge resource of an organisation, as they describe the procedures for the production of products and services. The implementation in a software prototype allows a precise description of the process by adding semantics to the simple business process modelling language used.
This thesis contributes to the body of knowledge by providing a new model for knowledge transfer, which shows the process of knowledge transfer in greater detail and highlights influencing factors. The implementation in the area of business process modelling reveals the support provided by the model. An expert evaluation indicates that the implementation of the proposed model supports knowledge transfer in business process modelling. The results of this qualitative evaluation are supported by the findings of a quantitative evaluation, performed as a quasi-experiment with a pre-test/post-test design and two experimental groups and one control group. Mann-Whitney U tests indicated that the group that used the tool implementing the layered model performed significantly better in terms of completeness (the degree of completeness achieved in the transfer) in comparison with the group that used a standard BPM tool (Z = 3.057, p = 0.002, r = 0.59) and the control group that used pen and paper (Z = 3.859, p < 0.001, r = 0.72). The experiment indicates that the implementation of the layered model supports the creation of a business process and facilitates a more precise representation.
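The Mann-Whitney U statistic reported in evaluations like the one above counts, over all cross-group pairs, how often a value from one sample exceeds a value from the other; the Z value then follows from a normal approximation. A minimal sketch (without the tie correction that the actual analysis may have applied):

```python
import math

def mann_whitney_u(a, b):
    """U statistic of sample `a` versus `b`: the number of pairs (x, y),
    x from a and y from b, with x > y, counting exact ties as 1/2."""
    u = 0.0
    for x in a:
        for y in b:
            if x > y:
                u += 1.0
            elif x == y:
                u += 0.5
    return u

def mann_whitney_z(a, b):
    """Normal approximation of the U statistic (no tie correction)."""
    n1, n2 = len(a), len(b)
    mu = n1 * n2 / 2.0                                  # mean of U under H0
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)  # std dev of U under H0
    return (mann_whitney_u(a, b) - mu) / sigma
```

If every value of one group exceeds every value of the other, U equals n1 * n2 and Z is clearly positive, mirroring the direction of the reported group differences.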
Knowledge transfer is very important to our knowledge-based society, and many approaches have been proposed to describe this transfer. However, these approaches take a rather abstract view on knowledge transfer, which makes implementation difficult. In order to address this issue, we introduce a layered model for knowledge transfer that structures the individual steps of knowledge transfer in more detail. This paper gives a description of the process as well as an example of the application of the layered model for knowledge transfer. The example is located in the area of business process modelling. Business processes contain important knowledge describing the procedures of the company to produce products and services. Knowledge transfer is the fundamental basis of the modelling and usage of business processes, which makes it an interesting use case for the layered model for knowledge transfer.
Business processes are important knowledge resources of a company. The knowledge contained in business processes conveys the procedures used to create products and services. However, the modelling and application of business processes are affected by problems connected to knowledge transfer. This paper presents and implements a layered model to improve knowledge transfer, thus supporting the modelling and understanding of business process models. An evaluation of the approach is presented, and the results as well as other areas of application are discussed.
Learning and teaching require the transfer of knowledge from one person to another. Due to the relevance of knowledge, many models have been developed for knowledge transfer. However, the process of knowledge transfer has not yet been described completely, and the approaches are too vague to facilitate its implementation. This paper contributes to a better understanding of knowledge transfer in order to support knowledge transfer in teaching. To address this challenge, we depict a layered model for knowledge transfer. The model structures the transfer in several steps and thus identifies major influencing factors. The paper describes the knowledge transfer from one person to another step by step. An example in the area of teaching business process management illuminates the process. The main contribution of this paper is the development of a layered model and its application in teaching.
Workshop Java EE 7 : a practical introduction to the Java Enterprise Edition with the Web Profile
(2015)
This workbook offers a practical introduction to developing business applications with Java EE 7. Step by step, you build an easy-to-follow example application based on the Web Profile. Along the way, you learn all the important technologies and concepts of Java EE 7, including: graphical user interfaces with JavaServer Faces and HTML5; business logic with CDI and EJB; persistence with JPA; communication with REST, SOAP, and WebSockets; advanced concepts such as Resource Library Contracts, interceptors, transactions, timers, and security. Beyond Java EE 7, further practice-relevant topics such as build management and testing are also covered. Deployment is carried out on the WildFly 8 and Glassfish 4 application servers as well as via the OpenShift cloud offering. At the end of each development phase you will find exercises and review questions. After working through the book, you will be able to set up, develop, and deploy Java EE 7 applications on an application server on your own. Experience with Java development is assumed; a basic knowledge of HTML and of the architecture of web applications is helpful. The second edition now also covers internationalization as well as the creation of functional tests with Graphene.
A large body of literature is concerned with models of presence, the sensory illusion of being part of a virtual scene, but there is still no general agreement on how to measure it objectively and reliably. For the presented study, we applied contemporary theory to measure presence in virtual reality. Thirty-seven participants explored an existing commercial game in order to complete a collection task. Two startle events were naturally embedded in the game progression to evoke physical reactions, and head-tracking data was collected in response to these events. Subjective presence was recorded using a post-study questionnaire and real-time assessments. Our novel implementation of behavioral measures led to insights which could inform future presence research: we propose a measure in which startle reflexes are evoked through specific events in the virtual environment, and head-tracking data is compared to the range and speed of baseline interactions.
Private equity (PE) firms are investment firms that acquire equity shares in companies. The goal of PE firms is to exit the investment after a few years with a substantial increase in value. PE firms often claim to outperform the market, i.e. to create alpha.
The overall aim of this paper is to unravel the mystery of value creation in the PE industry. First, the author presents a conceptual framework for value creation in the PE industry based on a multiple valuation model that breaks down value creation into different elements. Second, the paper evaluates whether PE firms really create value by analysing and combining results from prior empirical studies based on the conceptual framework.
The results show that the existing empirical evidence is mixed but that there is indeed a tendency toward positive evidence that PE firms create economic value on average. However, there are methodological difficulties in measuring value creation, and studies are often subject to bias. Finally, it is pointed out that the question of whether PE firms really create value has to be viewed from different perspectives, such as those of the PE firm, the investors, and the portfolio companies.
The automation of work by means of disruptive technologies such as Artificial Intelligence (AI) and Robotic Process Automation (RPA) is currently intensely discussed in business practice and academia. Recent studies indicate that many tasks manually conducted by humans today will no longer be performed by humans in the future. In a similar vein, it is expected that new roles will emerge. The aim of this study is to analyze prospective employment opportunities in the context of RPA in order to foster our understanding of the pivotal qualifications, expertise, and skills necessary to find an occupation in a completely changing world of work. This study is based on an explorative content analysis of 119 job advertisements related to RPA in Germany. The data was collected from major German online job platforms, qualitatively coded, and subsequently analyzed quantitatively. The research indicates that there are indeed employment opportunities, especially in the consulting sector. The positions require various kinds of technological expertise, such as specific programming languages and knowledge of statistics. The results of this study provide guidance for organizations and individuals on reskilling requirements for future employment. As many of the positions require profound IT expertise, the generally accepted view that existing employees affected by automation can be retrained to work in the emerging positions has to be seen extremely critically. This paper contributes to the body of knowledge by providing a novel perspective on the ongoing discussion of employment opportunities and reskilling demands of the existing workforce in the context of recent technological developments and automation.
Purpose
Digital transformation of organizations has major implications for required skills and competencies of the workforce, both as a prerequisite for implementation, and, as a consequence of the transformation. The purpose of this study is to analyze required skills and competencies for digital transformation using the context of robotic process automation (RPA) as an example.
Design/methodology/approach
This study is based on an explorative, thematic coding analysis of 119 job advertisements related to RPA. The data was collected from major online job platforms, qualitatively coded and subsequently analyzed quantitatively.
Findings
The research highlights the general importance of specific skills and competencies for digital transformation and shows a gap between available skills and required skills. Moreover, it is concluded that reskilling the existing workforce might be difficult. Many emerging positions can be found in the consulting sector, which raises questions about the permanent vs temporary nature of the requirements, as well as the difficulty of acquiring the required knowledge.
Originality/value
This paper contributes to knowledge by providing new empirical findings and a novel perspective to the ongoing discussion of digital skills, employment effects and reskilling demands of the existing workforce owing to recent technological developments and automation in the overall context of digital transformation.
In the last few years, business firms have substantially invested into the artificial intelligence (AI) technology. However, according to several studies, a significant percentage of AI projects fail or do not deliver business value. Due to the specific characteristics of AI projects, the existing body of knowledge about success and failure of information systems (IS) projects in general may not be transferrable to the context of AI. Therefore, the objective of our research has been to identify factors that can lead to AI project failure. Based on interviews with AI experts, this article identifies and discusses 12 factors that can lead to project failure. The factors can be further classified into five categories: unrealistic expectations, use case related issues, organizational constraints, lack of key resources, and, technological issues. This research contributes to knowledge by providing new empirical data and synthesizing the results with related findings from prior studies. Our results have important managerial implications for firms that aim to adopt AI by helping the organizations to anticipate and actively manage risks in order to increase the chances of project success.
In the upcoming years, huge benefits are expected from Artificial Intelligence (AI). However, there are also risks involved in the technology, such as accidents of autonomous vehicles or discrimination by AI-based recruitment systems. This study aims to investigate public perception of these risks, focusing on realistic risks of Narrow AI, i.e., the type of AI that is already productive today. Based on perceived risk theory, several risk scenarios are examined using data from an exploratory survey. This research shows that AI is perceived positively overall. The participants, however, do evaluate AI critically when being confronted with specific risk scenarios. Furthermore, a strong positive relationship between knowledge about AI and perceived risk could be shown. This study contributes to knowledge by advancing our understanding of the awareness and evaluation of the risks by consumers and has important implications for product development, marketing and society.
For half a decade, there has been an increasing interest in Robotic Process Automation (RPA) among business firms. Academic literature, however, paid little attention to RPA before adopting the topic to a larger extent. The aim of this study is to review and structure the latest state of scholarly research on RPA. This chapter is based on a systematic literature review that is used as a basis to develop a conceptual framework to structure the field. Our study shows that some areas of RPA have been extensively examined by many authors, e.g. the potential benefits of RPA. Other categories, such as empirical studies on the adoption of RPA or organisational readiness models, have remained research gaps.
In the context of digital transformation, having a data-driven organizational culture has been recognized as an important factor for the data analytics capabilities, innovativeness, and competitive advantage of firms. However, the current literature on data-driven culture (DDC) is fragmented, lacking both a synthesis of findings and a theoretical foundation. Therefore, the aim of this work has been to develop a comprehensive framework for understanding DDC and the mechanisms that can be used to embed such a culture in organizations, as well as to structure prior dispersed findings on the topic. Based on the foundation of organizational culture theory, we employed a Design Science Research (DSR) approach using a systematic literature review and expert interviews to build and evaluate a transformation-oriented framework. This research contributes to knowledge by synthesizing previously dispersed knowledge in a holistic framework and by providing a conceptual framework to guide the transformation towards a DDC.
For some years now, companies have increasingly been applying an agile approach, particularly in IT-related projects, also in controlling. Experience shows, however, that this does not work for all projects in every company. Hybrid approaches that combine agile with traditional project management methods offer a solution.
According to numerous studies, haptic feedback is an important component of medical robotics. Most systems, however, are still at the research stage and pursue different approaches. In teleoperation, both sensorless and sensor-based systems are being researched. In contrast to the encoders in sensorless systems, sensors provide accurate measurements, but they are expensive to purchase, hard to disinfect, and must be integrated into surgical instruments. In hands-on systems, unlike teleoperation systems, the surgeon directly feels the forces that occur during use. In these systems, the robot merely provides the required stability and precision; they are controlled directly by the human. In teleoperation systems, by contrast, dedicated controllers are used. Here, the sigma.7, developed specifically for the operating room, has become established. Compared to its competitors developed for general use, it offers haptic feedback in all required degrees of freedom and corresponding force feedback.
Assistant platforms are becoming a key element of the business model of many companies. They have evolved from assistance systems that provide support when using information (or other) systems to platforms in their own right. Alexa, Cortana, or Siri may be used with literally thousands of services. Against this background, this paper develops the notion of assistant platforms and elaborates a conceptual model that supports businesses in developing appropriate strategies. The model consists of three main building blocks: the architecture, which depicts the components as well as the possible layers of an assistant platform; the mechanism, which determines the value creation on assistant platforms; and the ecosystem with its network effects, which emerge from the multi-sided nature of assistant platforms. The model has been derived from a literature review and is illustrated with examples of existing assistant platforms. Its main purpose is to advance the understanding of assistant platforms and to trigger future research.
Digital assistants like Alexa, Google Assistant or Siri have seen large adoption over the past years. Using artificial intelligence (AI) technologies, they provide a vocal interface to physical devices as well as to digital services and have spurred an entirely new ecosystem. This comprises the big tech companies themselves, but also a strongly growing community of developers that make these functionalities available via digital platforms. At present, only little research is available to understand the structure and the value creation logic of these AI-based assistant platforms and their ecosystem. This research adopts ecosystem intelligence to shed light on their structure and dynamics. It combines existing data collection methods with an automated approach that proves useful in deriving a network-based conceptual model of Amazon's Alexa assistant platform and ecosystem. It shows that skills are a key unit of modularity in this ecosystem, which is linked to other elements such as service, data, and money flows. It also suggests that the topology of the Alexa ecosystem may be described using the criteria of reflexivity, symmetry, variance, strength, and centrality of the skill coactivations. Finally, it identifies three ways to create and capture value on AI-based assistant platforms. Surprisingly, only a few skills use a transactional business model by selling services and goods, but many skills are complementary and provide information, configuration, and control services for other skill providers' products and services. These findings provide new insights into the highly relevant ecosystems of AI-based assistant platforms, which might serve enterprises in developing their strategies in these ecosystems. They might also pave the way to a faster, data-driven approach to ecosystem intelligence.
Assistant platforms
(2023)
Many assistant systems have evolved toward assistant platforms. These platforms combine a range of resources from various actors via a declarative and generative interface. Among the examples are voice-oriented assistant platforms like Alexa and Siri, as well as text-oriented assistant platforms like ChatGPT and Bard. They have emerged as valuable tools for handling tasks without requiring deeper domain expertise and have received large attention with the present advances in generative artificial intelligence. In view of their growing popularity, this Fundamental outlines the key characteristics and capabilities that define assistant platforms. The former comprise a multi-platform architecture, a declarative interface, and a multi-platform ecosystem, while the latter include capabilities for composition, integration, prediction, and generativity. Based on this framework, a research agenda is proposed along the capabilities and affordances for assistant platforms.
Platforms feature increasingly complex architectures with regard to interconnecting with other digital platforms as well as with a variety of devices and services. This development also impacts the structure of digital platform ecosystems and forces the providers of these platforms, devices, and services to incorporate this complexity in their decision-making. To contribute to the existing body of knowledge on measuring ecosystem complexity, the present research proposes two key artefacts based on ecosystem intelligence. On the one hand, complementarity graphs represent ecosystems with an ecosystem's functional modules as vertices and complementarities as edges; the nodes carry information about the category membership of the module. On the other hand, a process is suggested that can collect important information for ecosystem intelligence using proxies and web scraping. Our approach allows replacing data which today is largely unavailable due to competitive reasons. We demonstrate the use of the artefacts in category-oriented complementarity maps that aggregate the information from complementarity graphs and support decision-making. They show which combinations of module categories create strong and weak complementarities. The paper evaluates the complementarity maps and the data collection process by creating category-oriented complementarity graphs for the Alexa skill ecosystem and concludes with a call to pursue more research based on functional ecosystem intelligence.
Potentials of smart contracts-based disintermediation in additive manufacturing supply chains
(2019)
We investigate which potentials are created by using smart contracts for disintermediation in supply chains for additive manufacturing. Using a qualitative, critical realist research approach, we analyzed three case studies with companies active in additive manufacturing. Based on interviews with experts from these companies, we identified eight key requirements for disintermediation and associated them with four potentials of smart contracts-based disintermediation.
Digitization transforms business process models and processes in many enterprises. However, many of them need guidance on how digitization impacts the design of their information systems. Therefore, this paper investigates the influence of digitization on information system design. We apply a two-phase research method combining a literature review and an exploratory case study. The case study took place at the IT service provider of a large insurance enterprise. The study's results suggest that a number of areas of information system design are affected, such as architecture, processes, data, and services.