Informatik
Enterprise Architectures (EA) consist of a multitude of architecture elements, which relate to each other in manifold ways. As the change of a single element hence impacts various other elements, mechanisms for architecture analysis are important to stakeholders. The high number of relationships aggravates architecture analysis and makes it a complex yet important task. In practice, EAs are often analyzed using visualizations. This article contributes to the field of visual analytics in enterprise architecture management (EAM) by reviewing how state-of-the-art software platforms in EAM support stakeholders with respect to providing and visualizing the “right” information for decision-making tasks. In a research study, we investigate the collaborative decision-making process in an experiment in which master's students use professional EAM tools. We evaluate the students’ findings by comparing them with the experience of an enterprise architect.
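The impact propagation described above (a change to one element affecting many others through its relationships) can be illustrated as a simple graph traversal. This is a minimal sketch; the element names and the "depends on" edges are hypothetical, not taken from the article.

```python
from collections import deque

def impacted_elements(relations, changed):
    """Return all architecture elements transitively impacted
    when `changed` is modified (breadth-first traversal)."""
    impacted, queue = set(), deque([changed])
    while queue:
        element = queue.popleft()
        for dependent in relations.get(element, []):
            if dependent not in impacted:
                impacted.add(dependent)
                queue.append(dependent)
    return impacted

# Hypothetical EA: edges map an element to the elements depending on it
relations = {
    "CRM database": ["CRM application"],
    "CRM application": ["Order process", "Reporting service"],
    "Reporting service": ["Management dashboard"],
}
print(sorted(impacted_elements(relations, "CRM database")))
```

A real EAM repository would of course distinguish relationship types and stop propagation at architectural boundaries; the traversal above only shows why a single change can touch a surprisingly large part of the landscape.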
An operating room is a stressful work environment. Nevertheless, everyone involved has to work safely, as there is no room for mistakes. To ensure a high level of concentration and seamless interaction, all involved persons have to know their own tasks and the tasks of their colleagues. The entire team must work synchronously at all times. To optimize the overall workflow, a task manager supporting the team was developed. In parallel, a common conceptual design of a business process visualization was developed, which makes all relevant information accessible in real time during a surgery. In this context, an overview of all processes in the operating room was created, and different concepts for the graphical representation of these user-dependent processes were developed. This paper describes the concept of the task manager as well as the general concept in the field of surgery.
The proposed approach applies current unsupervised clustering approaches in a different, dynamic manner. Instead of taking all the data as input and finding clusters among them, the given approach clusters Holter ECG data (long-term electrocardiography data from a Holter monitor) on a given interval, which enables a dynamic clustering approach (DCA). To this end, advanced clustering techniques based on the well-known Dynamic Time Warping algorithm are used. Given clusters on, for example, a daily basis, the clusters can be compared by defining cluster shape properties. This yields a measure of variation in unsupervised cluster shapes and may reveal unknown changes in health status. Embedding this approach into wearable devices offers advantages over current techniques. On the one hand, users get feedback if the characteristics of their ECG data change unexpectedly over time, which makes early detection possible. On the other hand, cluster properties such as the biggest or smallest cluster may help a doctor in making diagnoses or in observing several patients. Further, known processing techniques such as stress detection or arrhythmia classification may be applied to the clusters found.
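The interval-wise clustering described above can be sketched with a plain DTW distance and a greedy grouping step. This is a minimal illustration, not the authors' DCA implementation; the beat morphologies and the distance threshold below are invented.

```python
def dtw_distance(a, b):
    """Dynamic Time Warping distance between two sequences
    (classic O(len(a) * len(b)) dynamic programme)."""
    inf = float("inf")
    cost = [[inf] * (len(b) + 1) for _ in range(len(a) + 1)]
    cost[0][0] = 0.0
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j], cost[i][j - 1], cost[i - 1][j - 1])
    return cost[len(a)][len(b)]

def cluster(beats, threshold):
    """Greedily assign each beat to the first cluster whose
    representative is within `threshold` DTW distance."""
    clusters = []  # each cluster is a list of beats
    for beat in beats:
        for c in clusters:
            if dtw_distance(beat, c[0]) <= threshold:
                c.append(beat)
                break
        else:
            clusters.append([beat])
    return clusters

# Hypothetical beat morphologies from one monitoring interval (e.g. one day)
beats = [[0, 1, 0], [0, 1.1, 0], [0, 5, 0], [0, 0.9, 0.1]]
day_clusters = cluster(beats, threshold=1.0)
print(len(day_clusters))  # the outlier beat [0, 5, 0] forms its own cluster
```

Running the same grouping per interval and then comparing cluster shape properties (count, size of the largest cluster, centroid shape) across days gives exactly the kind of variation measure the abstract describes.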
New business concepts such as Enterprise 2.0 foster the use of social software in enterprises. Social production in particular significantly increases the amount of data in the context of business processes. Unfortunately, these data are still an untapped treasure in many enterprises. Due to advances in data processing such as Big Data, the exploitation of context data becomes feasible. To provide a foundation for the methodical exploitation of context data, this paper introduces a classification based on two classes: intrinsic and extrinsic data.
Modern enterprises are continuously reshaped and transformed by a multitude of management processes with different perspectives. They range from business process management to IT service management and the management of the information systems. Enterprise Architecture (EA) management seeks to provide such a perspective and to align the diverse management perspectives. To achieve and promote this alignment, EA management cannot rely on hierarchical management processes designed in a Tayloristic manner. Instead, it has to apply bottom-up, information-centered coordination mechanisms to ensure that the different management processes are aligned with each other and with the enterprise strategy. Social software provides such a bottom-up mechanism for supporting EAM processes. Consequently, challenges of EA management processes are investigated and contributions of social software are presented. A cockpit provides interactive functions and visualization methods to cope with this complexity and to enable the practical use of social software in enterprise architecture management processes.
Leveraging textual information for improving decision making in the business process lifecycle
(2015)
Business process implementations fail because requirements are elicited incompletely. At the same time, a huge amount of unstructured data is not used for decision-making during the business process lifecycle. Data from questionnaires and interviews are collected but not exploited because the effort of doing so is too high. Therefore, this paper shows how to leverage textual information for improving decision making in the business process lifecycle. To do so, text mining is used to analyze questionnaires and interviews.
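The text-mining step mentioned above can be illustrated with a plain TF-IDF scoring of free-text answers, which surfaces the wording that distinguishes one answer from the rest. This is a generic sketch, not the paper's method; the questionnaire answers below are invented.

```python
import math
from collections import Counter

def tfidf(documents):
    """Per-document TF-IDF scores: term frequency weighted by
    log(N / document frequency) across the answer collection."""
    n = len(documents)
    tokenized = [doc.lower().split() for doc in documents]
    df = Counter(term for doc in tokenized for term in set(doc))
    scores = []
    for doc in tokenized:
        tf = Counter(doc)
        scores.append({t: (c / len(doc)) * math.log(n / df[t]) for t, c in tf.items()})
    return scores

# Hypothetical free-text answers from a requirements questionnaire
answers = [
    "approval step is too slow",
    "approval step needs a deadline",
    "reporting is missing entirely",
]
scores = tfidf(answers)
top = max(scores[2], key=scores[2].get)
print(top)  # the distinguishing term of the third answer
```

Terms shared by every answer get an IDF of zero, so generic words drop out and the analyst is pointed at the requirement-bearing vocabulary.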
The digitization of our society changes the way we live, work, learn, communicate, and collaborate. This disruptive change interacts with all information processes and systems, which have been important business enablers in the context of digitization for years. Our aim is to support flexibility and agile transformations for both business domains and the related information technology and enterprise systems through the adaptation and evolution of digital enterprise architectures. The present research paper investigates collaborative decision mechanisms for adaptive digital enterprise architectures by extending original architecture reference models with state-of-the-art elements for agile architectural engineering for the digitization and for collaborative architectural decision support.
Excellence in IT is both a driver and a key enabler of the digital transformation. The digital transformation changes the way we live, work, learn, communicate, and collaborate. The Internet of Things (IoT) fundamentally influences today’s digital strategies with disruptive business operating models and fast-changing markets. New business information systems are integrating emerging Internet of Things infrastructures and components. Given the huge diversity of Internet of Things technologies and products, organizations have to leverage and extend previous Enterprise Architecture efforts to enable business value by integrating Internet of Things architectures. Both the architectural engineering and the management of current information systems and business models are complex tasks that, besides the Internet of Things, currently integrate synergistic subjects such as Enterprise Architecture in the context of services and cloud computing, semantics-based decision support through ontologies and knowledge-based systems, big data management, as well as mobility and collaboration networks. To provide adequate decision support for complex business/IT environments, we have to make the impact of business and IT changes transparent across the entire landscape of affected architectural capabilities, such as directly and transitively impacted IoT objects, business categories, processes, applications, services, platforms, and infrastructures. The paper describes a new metamodel-based approach for integrating Internet of Things architectural objects, which are semi-automatically federated into a holistic Digital Enterprise Architecture environment.
Excellence in IT is a key enabler for the digital transformation of enterprises. To realize the vision of digital enterprises, it is necessary to cope with changing business requirements and to align business and IT. In order to evaluate the contribution of enterprise architecture management to these goals, our paper explores the impact of various factors on the perceived benefit of EAM in enterprises. Based on the literature, we build an empirical research model. It is tested with empirical data from European EAM experts using a structural equation modelling approach. It is shown that changing business requirements, IT-business alignment, the complexity of the information technology infrastructure, as well as the enterprise architecture knowledge of information technology employees are crucial factors influencing the perceived benefit of EAM in enterprises.
Workshop Java EE 7 : ein praktischer Einstieg in die Java Enterprise Edition mit dem Web Profile
(2015)
This workbook offers a practical introduction to developing business applications with Java EE 7. Step by step, you build an easy-to-follow sample application based on the Web Profile. Along the way you learn all the important technologies and concepts of Java EE 7, including: graphical user interfaces with JavaServer Faces and HTML5; business logic with CDI and EJB; persistence with JPA; communication with REST, SOAP, and WebSockets; and advanced concepts such as Resource Library Contracts, interceptors, transactions, timers, and security. Beyond Java EE 7, further practice-relevant topics such as build management and testing are also covered. Deployment is carried out on the application servers WildFly 8 and Glassfish 4 as well as via the cloud offering OpenShift. At the end of each development phase you will find exercises and review questions. After working through the book, you will be able to set up, develop, and deploy Java EE 7 applications on an application server on your own. Experience in Java development is assumed; a basic knowledge of HTML and of the architecture of web applications is helpful. The second edition now also covers internationalization and the creation of functional tests with Graphene.
The Seventh International Conferences on Pervasive Patterns and Applications (PATTERNS 2015), held between March 22-27, 2015 in Nice, France, continued a series of events targeting the application of advanced patterns at large. In addition to support for patterns and pattern processing, special categories of patterns covering ubiquity, software, security, communications, discovery, and decision were considered. It is believed that patterns play an important role in the areas of cognition, automation, and service computation and orchestration. Antipatterns emerge as a natural output, in the form of lessons learned.
The digital transformation refers to the increasing digitization of content and processes and the growing importance of digital media in business and society. This change is driven, among other things, by the evolution in the use of the Internet. While the so-called Web 1.0 phase focused on the publication and distribution of static content, Web 2.0 primarily stimulates processes of decentralized creation and easy distribution of user-generated content. Companies must respond to these changes in order to secure their competitiveness in the long term. This article focuses on the further development of customer service. In recent years, many companies classified customer service primarily as a cost factor of little strategic importance. This view has changed fundamentally in the digital transformation. Today, customers can address defects in products and services immediately and with great reach through forums and social media channels. Companies must respond on the same channels in order to contain the multiplication of negative views and to avoid spillover effects onto traditional media. At the same time, digital channels give rise to entirely new service offerings that have a lasting effect on corporate competitiveness. This article first provides an overview of the main lines of development of the digital transformation. On this basis, it outlines the perspectives for companies to integrate digital media into their own value chain. In addition, the change in customer service in the so-called Web 2.0 is discussed in particular. An outlook on future developments in digitization rounds off the article.
In customer relationship management there is great interest in the use of social media. At present, however, there are hardly any conceptually sound and empirically validated solutions for social CRM.
Social media offer innovative perspectives for managing customer relationships. Exploiting these opportunities, however, places high demands on the marketing strategy, which is sometimes neglected.
There is no single, all-encompassing management method for holistic performance management. What matters instead is the interplay of all success-critical management disciplines within an integrative management system, in which all actors and stakeholders pull together in a coordinated way, even when their focus and perspectives differ. It is critical to success, however, that a company-specific adaptation is planned, composed, and interlocked against a holistic background of experience. Management cockpits can make a valuable contribution as a staged solution: as an integration layer, they create transparency and a communication platform for holistic performance management, even if the full functional, methodological, process-related, and technical integration has not yet been completely achieved.
In today's global competition, companies need effective and efficient performance management in order to secure their success in the long term. Such holistic, long-term performance management can only meet expectations if all success-critical management disciplines are optimally aligned within an integrative management system.
This article shows which fundamental management methods and instruments can be identified to explain the difference between consistently successful and unsuccessful companies. In this context, an approach for an end-to-end performance management process is developed, in which the central sources of problems in introducing performance management are classified and explained.
In order to explore an image, the human eye functions like a spotlight, scanning the content from one object to the next. This visual search behavior is implemented with the help of attention control. The following work surveys visual search behavior in "Wimmelpictures", a special type of busy picture. The research objective is to analyze different search strategies and to work out possible differences concerning age and gender. The university experiment is carried out with an eye tracker that records the fixations and saccades of the test persons. The results indicate three forms of search strategy: based on a pattern, based on feature selection, or a mixture of both. Our data show that searching for specific features of the target is the most successful strategy. Furthermore, there are no differences concerning gender, but some concerning age. All age groups need more time to locate the target as the number of distractors in the image increases. The size of the target is also relevant, as a larger target is found more quickly than a smaller one.
Knowledge is an important resource, whose transfer is still not completely understood. The underlying belief of this thesis is that knowledge cannot be transferred directly from one person to another but must be converted for the transfer and therefore is subject to loss of knowledge and misunderstanding. This thesis proposes a new model for knowledge transfer and empirically evaluates this model. The model is based on the belief that knowledge must be encoded by the sender to transfer it to the receiver, who has to decode the message to obtain knowledge.
To prepare for the model, this thesis provides an overview of models for knowledge transfer and of factors that influence knowledge transfer. The proposed theoretical model for knowledge transfer is implemented in a prototype to demonstrate its applicability. The model describes the influence of four layers, namely the code, syntactic, semantic, and pragmatic layers, on the encoding and decoding of the message. The precise description of the influencing factors and of the overlapping knowledge of sender and receiver facilitates its implementation.
The application area chosen for the layered model of knowledge transfer is business process modelling. Business processes incorporate an important knowledge resource of an organisation, as they describe the procedures for the production of products and services. The implementation in a software prototype allows a precise description of the process by adding semantics to the simple business process modelling language used.
This thesis contributes to the body of knowledge by providing a new model for knowledge transfer, which shows the process of knowledge transfer in greater detail and highlights influencing factors. The implementation in the area of business process modelling reveals the support provided by the model. An expert evaluation indicates that the implementation of the proposed model supports knowledge transfer in business process modelling. The results of this qualitative evaluation are supported by the findings of a quantitative evaluation, performed as a quasi-experiment with a pre-test/post-test design, two experimental groups, and one control group. Mann-Whitney U tests indicated that the group that used the tool implementing the layered model performed significantly better in terms of completeness (the degree of completeness achieved in the transfer) than the group that used a standard BPM tool (Z = 3.057, p = 0.002, r = 0.59) and the control group that used pen and paper (Z = 3.859, p < 0.001, r = 0.72). The experiment indicates that the implementation of the layered model supports the creation of a business process and facilitates a more precise representation.
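The statistics reported above can be sketched as follows: the Mann-Whitney U statistic with its normal approximation, and the effect size r = Z / sqrt(N). The group sizes of the quasi-experiment are not given in the abstract, so the total N in the example is a hypothetical value chosen only to show how r is derived from Z.

```python
import math

def mann_whitney_z(x, y):
    """Mann-Whitney U statistic with normal approximation:
    returns (U, Z) where Z = (U - mu) / sigma (no tie correction)."""
    n1, n2 = len(x), len(y)
    # U = number of (x_i, y_j) pairs with x_i > y_j (+0.5 for ties)
    u = sum(1.0 if a > b else 0.5 if a == b else 0.0 for a in x for b in y)
    mu = n1 * n2 / 2
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    return u, (u - mu) / sigma

def effect_size_r(z, n_total):
    """Effect size r = Z / sqrt(N), the measure reported in the abstract."""
    return z / math.sqrt(n_total)

# Hypothetical N = 27 participants across the two compared groups
# (group sizes are not stated in the abstract)
print(round(effect_size_r(3.057, 27), 2))  # → 0.59, the reported magnitude
```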
In recent years, the digital transformation has received significant attention in Business-to-Business (B2B) research. Social media applications provide executives with a raft of new options. Consequently, interfaces to social media platforms have also been integrated into B2B salesforce applications, although very little is as yet known about their usage and general impact on B2B sales performance. This paper evaluates 1) the conceptualization of social media usage in a dyadic B2B relationship; 2) the effects of a more differentiated usage construct on customer satisfaction; 3) antecedents of social media usage on multiple levels; and 4) the effectiveness of social media usage for different types of customers. The framework presented here is tested cross-industry against data collected from dyadic buyer-seller relationships in the IT service industry. The results elucidate the preconditions and the impact of social media usage strategies in B2B sales relations.
Customer services in the digital transformation: social media versus hotline channel performance
(2015)
Due to the digital transformation, online service strategies have gained prominence in practice as well as in the theory of service management. This study examines the efficacy of different types of service channels in customer complaint handling. The theoretical framework, developed using the complaint handling and social media literature, is tested against data collected from two different channels (hotline and social media) of a German telecommunication service provider. We contribute to the understanding of firms' multichannel distribution strategies in two ways: a) by conceptualizing and evaluating complaint handling quality across traditional and social media channels, and b) by testing the impact of complaint handling quality on key performance outcomes such as customer loyalty, positive word of mouth, and cross-purchase intentions.
The stimulation of user engagement has received significant attention in extant research. However, the theory of antecedents for user engagement with an initial electronic word-of-mouth (eWoM) communication is relatively less developed. In an investigation of 576 unique user postings across independent Facebook (FB) communities for two German firms, we contribute to the extant knowledge on user engagement in two different ways. First, we explicate senders’ prior usage experience and the extent of their acquaintance with other community members as the two key drivers of user engagement across a product and a service community. Second, we reveal that these main effects differ according to the type of community. In service communities, experience has a stronger impact on user engagement; whereas, in product communities, acquaintance is more important.
Social media usage in business-to-business sales : conceptualization, antecedents, and outcomes
(2015)
In recent years, the rise of social media has received significant attention in marketing research. Social media applications now provide executives with a raft of new options. Consequently, interfaces to social media platforms have also been integrated into Business-to-Business (B2B) salesforce applications, although very little is as yet known about their usage and general impact on B2B sales performance. This paper evaluates 1) the conceptualization of social media usage in a dyadic B2B relationship; 2) the effects of a more differentiated usage construct on customer satisfaction; 3) antecedents of social media usage on multiple levels; and 4) the effectiveness of social media usage for different types of customers. The framework presented here is tested cross-industry against data collected from dyadic buyer-seller relationships in the IT service industry. The results elucidate the preconditions and the impact of social media usage strategies in B2B sales relations.
In recent years, the rise of social media has received significant attention in marketing research and practice. Consequently, interfaces to social media platforms have also been integrated into Business-to-Business (B2B) salesforce applications, although very little is as yet known about their usage and general impact on B2B sales performance. This paper evaluates 1) the conceptualization of social media usage in dyadic B2B relationships; 2) the effects of a more differentiated usage construct on customer satisfaction; 3) antecedents of social media usage on multiple levels; and 4) the effectiveness of social media usage for different types of customers. The framework presented here is tested cross-industry against data collected from dyadic buyer-seller relationships in the IT service industry. The results elucidate the preconditions and the impact of social media usage strategies in B2B sales relations.
Viele Unternehmen befassen sich in jüngster Zeit mit der Nutzung von Social Media für die interne Kommunikation und Zusammenarbeit. So genannte Enterprise Social Networks bieten integrierte Plattformen mit Profilen, Blogs, Gruppen- und Kommentarfunktionen für die unternehmensinterne Anwendung. Sehr häufig sind damit umfangreiche Investitionen verbunden. Die Budgets werden im Kern für die IT verwendet, "weiche Faktoren" bleiben häufig außen vor. Ein schwerer Fehler, wie aktuelle Marktstudien zeigen. Etliche der ambitionierten Projekte drohen daher zu scheitern.
Purpose: This paper aims to conceptualize and empirically test the determinants of service interaction quality (SIQ) as the attitude, behavior, and expertise of a service provider (SP). Further, the individual and simultaneous effects of SIQ and its dimensions on important marketing outcomes are tested.
Design/methodology/approach: A narrative review of extant research helps formulate a conceptual model of SIQ, which is investigated using univariate and multivariate meta-analysis.
Findings: There are interdependencies between the drivers of SIQ, which underlines the need to conceptualize service interaction as a dyadic phenomenon; to use contemporary multilevel models, dyadic models, non-linear structural equation modeling, and process studies; and to study new and diverse service contexts. The meta-analysis illustrates the relative importance of the three drivers of SIQ and, in turn, their impact on consumer satisfaction and loyalty.
Research limitations/implications: The meta-analysis is based on existing research, which, unfortunately, has not examined critical services or exigency situations where SIQ is of paramount importance. Future research will be tasked with diversifying to several important domains where SIQ is a critical aspect of perceived service quality.
Practical implications: This study emphasizes that, although the expertise of an SP is important, firms would be surprised to learn that the attitude and behavior of their employees are equally important antecedents. In fact, there is a delicate balance that needs to be found; otherwise, attitudinal factors can have an overall counterproductive effect on consumer satisfaction.
Originality/value: This paper provides an empirical synthesis of SIQ and opens up interesting areas for further research.
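The univariate meta-analysis mentioned in the structured abstract typically pools study-level correlations; a minimal fixed-effect sketch using Fisher's z transform is shown below. The study correlations and sample sizes are invented, not taken from the paper, and the paper's actual pooling procedure may differ (e.g. random-effects models).

```python
import math

def pooled_correlation(correlations, sample_sizes):
    """Fixed-effect meta-analytic pooling of correlations:
    transform each r with Fisher's z = atanh(r), weight by n - 3,
    average, and back-transform with tanh."""
    zs = [math.atanh(r) for r in correlations]
    weights = [n - 3 for n in sample_sizes]
    z_bar = sum(w * z for w, z in zip(weights, zs)) / sum(weights)
    return math.tanh(z_bar)

# Hypothetical study-level correlations between SP expertise and satisfaction
print(round(pooled_correlation([0.30, 0.45, 0.38], [120, 80, 200]), 3))
```

The n - 3 weights come from the sampling variance of Fisher's z, 1 / (n - 3), so larger studies pull the pooled estimate toward their result.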
Management nowadays is confronted with a variety of information originating from internal as well as external sources. As a result, the difficulty of focusing on the relevant, company-critical key figures increases. In practice, information management is often a major weakness of efficient corporate management, caused by the lack of a centralized, categorized, and summarized presentation and analysis of strategy- and decision-relevant information. Management cockpits, a kind of information center for managers, are an approach to meet the challenges of information management. They provide a specific work environment for decision makers to get a quick and simple overview of the company’s economic situation. In the most completely equipped premises, the entire process is supported, from acquiring information to analysis, decision-making, and communication. With management cockpits, a cross-functional, KPI-based, and strategy-oriented controlling and management process can be successfully established in companies, and the work of interdisciplinary management teams is supported. To provide these possibilities, the management cockpit is equipped with a range of functionalities that allow the structuring, categorization, and management-adequate visualization of information, along with extensive analysis and simulation options. Management cockpits, as a communication and collaboration platform, are a starting point and a valuable process companion on the way to holistic and sustainable performance management.
The provisioning tool automaIT was extended with a prototype data discovery capability, with the goal of being able to connect and control systems not managed by automaIT. Data from the data discovery is collected using the tool Facter and can be dynamically integrated into executable automaIT models and evaluated. This makes it possible to control the course of further provisioning steps without any manual intervention.
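The pattern described above (facts discovered via Facter steering subsequent provisioning steps) can be sketched as follows. The fact names, the threshold, and the step names are hypothetical, and this is not automaIT's actual model format; it only illustrates branching provisioning logic on discovered facts.

```python
import json

def next_provisioning_steps(facter_json):
    """Derive follow-up provisioning steps from discovered facts.
    Fact names and step names are hypothetical examples."""
    facts = json.loads(facter_json)
    steps = []
    if facts.get("osfamily") == "Debian":
        steps.append("configure-apt-repository")
    if float(facts.get("memorysize_gb", 0)) < 4:
        steps.append("enable-swap")
    return steps

# Example of the kind of JSON output Facter can produce
# (abridged; values are hypothetical)
discovered = '{"osfamily": "Debian", "memorysize_gb": "2.0"}'
print(next_provisioning_steps(discovered))
```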
The DGCH is registering an increasing number of complaints from clinical practice regarding the incomplete networking and integration of device systems in the surgical operating room. The number, range of functions, and degree of complexity of the devices in use are constantly increasing, making their operation ever more laborious and thus more difficult and error-prone, so that improved workflow support is desirable. At the request of the Secretary General, the section Computer- und telematikassistierte Chirurgie (CTAC) of the DGCH has therefore undertaken to compile an up-to-date survey and to assess possible approaches for improving the current status.
New models for digital enterprise architectures with big data, services & cloud computing, mobile systems, the Internet of Things, and Industrie 4.0 ecosystems require close cooperation between partners from academia, user companies, public organizations, software vendors, and IT service providers. The goal of this cooperation is to bring together new concepts and capabilities of information technology to best support changing corporate goals and strategies. Software and enterprise architectures play a central role here: requirements for flexibility and agility in digital enterprises are substantially supported by service-oriented approaches. Digital Enterprise Architecture Management is intended to significantly improve the orderliness and cost-efficient design of complex IT landscapes, in line with the new possibilities of services & cloud computing, big data, and collaborative business processes.
This work focuses on supporting stent graft selection in the endovascular repair of infrarenal aortic aneurysms. A method for evaluating the results of a finite element analysis of stent graft behavior was designed, implemented, and discussed in a Germany-wide user study with 16 surgeons. The developed human-machine interface enables the vascular specialist to interactively analyze computed fixation forces and contact states of several stent grafts in the context of the aortic segment to be treated. The method allows physicians to engage more deeply with numerical simulations and stent graft assessment measures. Within the user study, this made it possible to determine the application potential of numerical simulations for supporting stent graft selection and to define a requirements specification for a system for simulation-based stent graft planning. The main application potentials identified were the determination of a minimum degree of oversizing, the optimization of limb length for bifurcated stent grafts, and the comparison of different stent graft designs. Essential functions of a system for simulation-based stent graft selection include an overview map with color-coded migration risk per stent graft and landing zone, the visualization of the sealing state of the stent components, and the display of stent graft and vessel deformations in a 3-D model.
Markets today are highly dynamic. This situation requires agile enterprises that are able to react quickly to market influences. An enterprise’s IT is especially affected, because new or changed business models have to be realized. However, enterprise architectures (EA) are complex structures consisting of many artifacts and the relationships between them. Thus, analyzing an EA becomes a complex task for stakeholders. In addition, many stakeholders are involved in decision-making processes, because Enterprise Architecture Management (EAM) aims at providing a holistic view of the enterprise. In this article we use concepts of Adaptive Case Management (ACM) to design a decision-making case that combines different analysis techniques to support stakeholders in decision-making. We illustrate the case with a scenario of a fictitious enterprise.
This paper presents a concurrency control mechanism that does not follow a ‘one concurrency control mechanism fits all needs’ strategy. With the presented mechanism, a transaction runs under several concurrency control mechanisms, and the appropriate one is chosen based on the accessed data. For this purpose, the data is divided into four classes based on its access type and usage (semantics). Class O (the optimistic class) implements a first-committer-wins strategy, class R (the reconciliation class) implements a first-n-committers-win strategy, class P (the pessimistic class) implements a first-reader-wins strategy, and class E (the escrow class) implements a first-n-readers-win strategy. Accordingly, the model is called O|R|P|E. Running the TPC-C benchmark, this model outperforms other concurrency control mechanisms such as optimistic Snapshot Isolation.
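The class-dependent commit rules can be sketched as follows. This is a simplified illustration, not the paper's implementation: it only shows how a validator might apply first-committer-wins (class O) versus first-n-committers-win (class R) at commit time, while the lock-based classes P and E are stubbed out.

```python
from collections import defaultdict

class Validator:
    """Toy per-data-class commit validation (illustrative, not the paper's code)."""

    def __init__(self, n_winners: int = 3):
        self.committed_writes = defaultdict(int)  # item -> commits since snapshot
        self.n = n_winners                        # the "n" of first-n-committers-win

    def may_commit(self, item: str, item_class: str) -> bool:
        if item_class == "O":    # optimistic: only the first committer wins
            ok = self.committed_writes[item] == 0
        elif item_class == "R":  # reconciliation: the first n committers win
            ok = self.committed_writes[item] < self.n
        else:                    # classes P and E use (reader) locks, not validation
            ok = True
        if ok:
            self.committed_writes[item] += 1
        return ok

v = Validator(n_winners=2)
print(v.may_commit("stock:42", "O"))  # True  (first committer wins)
print(v.may_commit("stock:42", "O"))  # False (second committer loses)
```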
The energy transition raises abundant questions for a wide range of scientific disciplines, including computer science and information systems (IS) research. Unfortunately, the area of regional energy cooperatives and smaller energy utilities has so far been largely neglected by IS research. This paper describes the current situation of these organizations and focuses on the existing knowledge gap concerning business models (BM) for energy cooperatives (EC), i.e., associations of private individuals or small companies that primarily produce regional, renewable energy. Model and theory development is based on a classical literature review, case studies in the energy industry, and graphical modeling. As a result, the reference business model of an EC is presented as a morphological Business Model Canvas. This singular BM is extended by a representation of the value creation network, which takes into account the structural integration of the actors into the EC’s digital ecosystem. The resulting reference model serves the critical examination of empirically observable BMs and the further development of enterprise architectures for digital business networks.
The digital enterprise requires new concepts of Digital Enterprise Computing, an interdisciplinary combination of approaches from computer science, economics, and other relevant scientific disciplines. New architectures with integrated mobility systems, collaborative business processes, big data, and cloud ecosystems drive current and future business strategies and make the digital transformation into new business fields possible in the first place. This requires close cooperation between partners from academia, industry, and society. The annual conference Digital Enterprise Computing positions the Gesellschaft für Informatik as a scientific co-organizer and builds on experience from the working group Enterprise Architecture Management of the special interest group Architekturen in the software engineering division of the Gesellschaft für Informatik.
In a world with rapidly changing customer requirements and an increased role of technology, companies need more flexible systems to adapt their processes and react dynamically to changes. Adaptive Case Management (ACM) comes into consideration by providing a concept for adapting to changing business conditions. Within our research project we conducted a first foundational evaluation of the potential of ACM for supporting unpredictable sales processes. Based on a set of criteria, we tested the concept of ACM with the open-source tool Cognoscenti. The evaluation allowed us to experience the concept of ACM in practice. Hence we were able to make a statement about the potential of ACM in the context of an unpredictable sales process, setting the path for further research and discussion of ACM in the area of sales processes.
The Seventh International Conference on Advances in Databases, Knowledge, and Data Applications (DBKDA 2015), held May 24-29, 2015 in Rome, Italy, continued a series of international events covering a large spectrum of topics related to advances in database fundamentals, the evolving relationship between databases and other domains, database technologies and content processing, as well as the specifics of application-domain databases. Advances in different technologies and domains related to databases have triggered substantial improvements in content processing, information indexing, and data, process, and knowledge mining. The push came from web services, artificial intelligence, and agent technologies, as well as from the widespread adoption of XML. High-speed communications and computations, large storage capacities, and load balancing for distributed database access allow new approaches to content processing with incomplete patterns, advanced ranking algorithms, and advanced indexing methods. Developments in e-business, e-health and telemedicine, bioinformatics, finance and marketing, and geographical positioning systems put pressure on the database community to push the de facto methods to support new requirements in terms of scalability, privacy, performance, indexing, and heterogeneity of both content and technology.
Interdisciplinarity is on everyone’s lips, yet it is often difficult to put into practice. Interesting research, however, frequently happens at the interfaces between individual fields. Visitors to the conference can expect contributions from a wide variety of areas, such as e-learning, automatic emotion recognition and animation, human-robot interaction, driver assistance systems, mechanisms of perception in virtual worlds, and the processing of digital human models. The presented works originated either at the computer science faculty itself or externally in cooperation with a research-active company or a research institute. In addition, works from other faculties are presented.
Scroll-activated animations open up new possibilities for web developers to present content and interact with users. By animating images, text, and other elements of a website, the novel presentation style is meant to positively surprise the user. The goal is to convey content to the user in a more interesting and targeted way. The question arises, however, whether the resulting increase in user experience comes at the expense of usability: the animations may well produce an aha effect for the user, yet reduce ease of use. For this reason, this work examines the usability and user experience of such animations and investigates the actual added value of scroll animations with the help of web analytics tools. The effects mentioned above are studied by comparison with a page of identical content. In addition, the results are broken down by device type to uncover possible differences.
Scanned human models are increasingly used for experiments in virtual reality. Providing realistic motion sequences, however, is time-consuming work. The goal of this work is to find a workflow that makes it possible to process a large number of such models in a very short time. To this end, the work examines different methods for automating skinning and rigging so that the models can be used in virtual environments driven by motion tracking. The quality of the processed models is evaluated using scans in different poses.
Where do I meet my customers? What do I learn from my users’ feedback? How do I measure success? In social networks you have to ask the right questions, says Internet researcher Prof. Alexander Rossmann. His study “Auf der Suche nach dem Return on Social Media” at the University of St. Gallen once caused a sensation.
EAM is a holistic approach to representing complex IT and enterprise structures. It is of central importance to visualize these structures as completely and clearly as possible. One approach to achieve this is a multi-perspective presentation of several views in an architecture cockpit, in which multiple views can be examined and analyzed simultaneously. This makes it possible to observe the effects of an analysis in one stakeholder’s view simultaneously from the views of other stakeholders, in order to detect possible interdependencies and to maintain a general overview of the enterprise architecture. In this work we show, from conception through implementation to an application example, how such an architecture cockpit can be realized.
Distraction of the driver is one of the most frequent causes of car accidents. We aim for a computational cognitive model predicting the driver’s degree of distraction while driving and performing a secondary task, such as talking with co-passengers. The secondary task might cognitively involve the driver to differing degrees depending on the topic of the conversation or the number of co-passengers. In order to detect these subtle differences in everyday driving situations, we aim to analyse in-car audio signals and combine this information with head pose and face tracking information. In the first step, we will identify driving, video, and audio parameters that reliably predict cognitive distraction of the driver. These parameters will be used to train the cognitive model to estimate the degree of the driver’s distraction. In the second step, we will train and test the cognitive model during conversations of the driver with co-passengers during active driving. This paper describes the work in progress of our first experiment, with preliminary results concerning driving parameters corresponding to the driver’s degree of distraction. In addition, the technical implementation of our experiment combining driving, video, and audio data and first methodological results concerning the auditory analysis are presented. The overall aim for the application of the cognitive distraction model is the development of a mobile user profile that computes the individual degree of distraction and is also applicable to other systems.
Information technology systems that support the workflow in the clinical domain are currently limited to organizational processes. This work presents a first approach to bringing such a system into the perioperative area. For this purpose, a workflow engine was coupled with a perioperative process visualization. The system was implemented according to the model-view-controller principle: the workflow engine serves as the controller, and a process model with the required clinical data serves as the model. The view was realized as a decoupled application based on web technologies. Three visualizations, the workflow engine, and the connection of both via a database interface were implemented successfully. The three visualizations comprise one view for the OR coordinator, one for the circulating nurse, and one overview of a single operation.
With the connected car, new competitors are pushing into the automotive industry. Using disruptive innovation methods, Google, Apple, Facebook, and others have already fundamentally changed entire industries and displaced market leaders such as Nokia or Otto within a few years. This work deals with these methods and with the question of how they can be integrated into the automotive product development process in order to place sustainably successful business models on the market.
While the value contribution of IT to business success was still disputed at the beginning of the millennium, today only very few managing directors deny it. Yet how value is created by aligning corporate and IT strategy through suitable IT architectures still seems mysterious to small and medium-sized enterprises (SMEs) across a wide range of industries. This gap is particularly fatal in SMEs of the cultural and creative industries, which serve classical industrial sectors as suppliers of innovation. This is where the present report comes in. It builds on the results of the research project KonfIT-SSC, which in recent years explored the possibility of using product configurators to achieve the “strategic fit” between business and IT structures. The central challenge of this endeavor was to collect data on information system structures and the ecosystems that shape them in such a way that the data become accessible to a formal modeling of rule sets and to the configuration of business architectures. The present report answers the questions of how suitable IT service strategies can be achieved for companies in the cultural and creative industries, what contribution product configurators can make, and with which methods data can be obtained in order to define generic IT architectures for SMEs in the creative sector. Along the way, in addition to the answers to the research questions, the results of the individual steps towards solving the task are documented in the form of a commercially available configurator. The methods employed for data collection comprise a classical literature review, an online survey, five case studies in small and medium-sized advertising companies, and expert interviews. For the analysis of the data, the modeling of value networks (e3value and i*) as well as the reference modeling of enterprise architectures are used. Finally, the development of the configuration models (rule sets) and the implementation are explained.
Small and medium-sized enterprises (SMEs) play a fundamental role in the economic system of the European Union: SMEs represent over 99 percent of all companies and provide two-thirds of the jobs in the private sector. Their innovativeness and economic success have significant influence on growth, jobs and prosperity in Europe.
Information technologies are regarded as key drivers of innovation in small and medium-sized enterprises (SMEs). Modern information technologies (IT) offer SMEs many opportunities to improve their competitiveness and market position: business processes can be designed efficiently, new market segments can be opened up, and innovation capacity can be strengthened significantly. However, many SMEs still have difficulties in using these new technologies efficiently to foster process and product innovation. This is partly because many SMEs do not use IT service management and waste resources on running basic IT functions such as the maintenance of printers, software, or servers.
Information Technology Service Management (ITSM) is a discipline for managing IT systems centred on the customer’s perspective of IT’s contribution to the business. Thus, by strengthening the performance of SMEs’ IT departments, ITSM enables process innovations (e.g., eProcurement) and promotes product innovations (e.g., client services). The EU-funded project “IT Service Management for small and medium-sized Enterprises of the Danube Region” (ITSM4SME) aims to make SMEs in the Danube Region aware of the potential of ITSM, to inspire SMEs about the use of information technology, and to enable IT-driven innovations. The aims of the project have been achieved, inter alia, through a simplified IT service management method for small IT organisations, practical case studies, a “do-it-yourself” service management modelling tool, an eLearning portal, and by training more than 300 participants from SMEs in pilot training courses in Bulgaria, Romania, and Slovenia.
Saving energy and protecting the environment have become fundamental concerns for society and politics, which is why several laws were enacted to increase energy-efficiency. Furthermore, the growing number of vehicles and drivers has led to more accidents and fatalities on the roads, which is why road safety has become an important factor as well. Due to the increasing importance of energy-efficiency and safety, car manufacturers started to optimise the vehicle in these respects. However, energy-efficiency and road safety can also be increased by adapting the driving behaviour to the given driving situation. This thesis presents a concept of an adaptive, rule-based driving system that tries to educate the driver in energy-efficient and safe driving by showing recommendations at the right time. Unlike existing driving systems, the presented system considers energy-efficiency- and safety-relevant driving rules, the individual driving behaviour, and the driver’s condition. This makes it possible to avoid distracting the driver and to increase the acceptance of the driving system, while improving the driving behaviour in terms of energy-efficiency and safety. A prototype of the driving system was developed and evaluated on a driving simulator with 42 test drivers, who tested the effect of the driving system on the driving behaviour and the effect of the system’s adaptiveness on user acceptance. The evaluation showed that energy-efficiency and safety increase when the driving system is used, and that user acceptance increases when the adaptive feature is turned on. High user acceptance allows steady usage of the driving system and, thus, a steady improvement of the driving behaviour in terms of energy-efficiency and safety.
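The rule-based core of such a system can be sketched as a function from a driving state to a list of recommendations. This is a hedged illustration only: the state fields, rule names, and thresholds below are invented and do not reflect the thesis's actual rule set, which additionally adapts to the individual driver.

```python
from dataclasses import dataclass

@dataclass
class DrivingState:
    speed_kmh: float
    speed_limit_kmh: float
    rpm: int
    distance_ahead_m: float

def recommend(state: DrivingState) -> list:
    """Apply illustrative safety and energy-efficiency rules to one state."""
    tips = []
    if state.speed_kmh > state.speed_limit_kmh + 5:
        tips.append("reduce speed")                 # safety rule
    if state.rpm > 2500:
        tips.append("shift up")                     # energy-efficiency rule
    if state.distance_ahead_m < state.speed_kmh / 2:
        tips.append("increase following distance")  # half-speedometer heuristic
    return tips

print(recommend(DrivingState(120, 100, 3000, 40)))
# ['reduce speed', 'shift up', 'increase following distance']
```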
The Internet of Things (IoT) fundamentally influences today’s digital strategies with disruptive business operating models and fast-changing markets. New business information systems integrate emerging Internet of Things infrastructures and components. Given the huge diversity of Internet of Things technologies and products, organizations have to leverage and extend previous enterprise architecture efforts to enable business value by integrating the Internet of Things into their evolving Enterprise Architecture Management (EAM) environments. Both architecture engineering and the management of current enterprise architectures are complex and have to integrate, besides the Internet of Things, synergistic disciplines such as services & cloud computing, semantics-based decision support through ontologies and knowledge-based systems, big data management, as well as mobility and collaboration networks. To provide adequate decision support for complex business/IT environments, it is necessary to identify the changes affecting Internet of Things environments and their related, fast-adapting architecture. We have to make the impact of these changes transparent across the entire landscape of affected EAM capabilities, such as directly and transitively impacted IoT objects, business categories, processes, applications, services, platforms, and infrastructures. The paper describes a new metamodel-based approach for integrating partial Internet of Things objects, which are semi-automatically federated into a holistic Enterprise Architecture Management environment.
Enterprise architecture management (EAM) is a holistic approach to tackling complex business and IT architectures. The transformation of an organization’s EA towards a strategy-oriented system is a continuous task. Many stakeholders have to elaborate on various parts of the EA to reach the best decisions and shape the EA towards an optimized support of the organization’s capabilities. Since the real world is too complex to grasp directly, analysis techniques are needed to detect optimization potential and to obtain all the information needed about an issue. In practice, visualizations are commonly used to analyze EAs. However, these visualizations are mostly static and do not provide analyses. In this article we combine analysis techniques from the literature with interactive visualizations to support stakeholders in EA decision-making.
The digital transformation of our society changes the way we live, work, learn, communicate, and collaborate. This disruptive change interacts with all information processes and systems, which have been important business enablers for the digital transformation for years. The Internet of Things, social collaboration systems for adaptive case management, and mobility systems and services for big data in cloud environments are emerging to support intelligent, user-centered, and social community systems. They will shape future trends of business innovation and the next wave of information and communication technology. Biological metaphors of living and adaptable ecosystems provide the logical foundation for self-optimizing and resilient run-time environments for intelligent business services and related distributed information systems with service-oriented enterprise architectures. The present research investigates mechanisms for flexible adaptation and evolution of digital enterprise architectures in the context of integrated synergistic disciplines such as distributed service-oriented architectures and information systems, EAM - Enterprise Architecture and Management, metamodeling, semantic technologies, web services, cloud computing, and big data technology. Our aim is to support flexibility and agile transformations for both business domains and related enterprise systems through adaptation and evolution of digital enterprise architectures. The present research paper investigates digital transformations of business and IT and integrates fundamental mappings between adaptable digital enterprise architectures and service-oriented information systems.
The character of knowledge-intensive processes is that participants decide on the next process activities on the basis of the information at hand and their expert knowledge. The decisions of these knowledge workers are in general non-deterministic. It is not possible to model these processes in advance and to automate them using the process engine of a BPM system. Hence, in this context a process instance is called a case, because there is no predefined model that could be instantiated. Domain-specific or general case management systems are used to support the knowledge workers. These systems provide all case information and enable users to define the next activities, but they have no or only limited activity recommendation capabilities. In the following paper, we present a general concept for a self-learning system based on process mining that suggests the next best activity for a given case based on quantitative and qualitative data. As a proof of concept, it was applied to the area of insurance claims settlement.
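A minimal, frequency-based sketch of next-activity recommendation mined from completed cases, in the spirit of the concept described above; the paper's system additionally weighs quantitative and qualitative case data, which is omitted here, and the insurance-claims activity names are invented for illustration.

```python
from collections import Counter, defaultdict

def train(cases):
    """Count, per activity, which activity followed it in completed cases."""
    successors = defaultdict(Counter)
    for trace in cases:
        for current, nxt in zip(trace, trace[1:]):
            successors[current][nxt] += 1
    return successors

def recommend_next(model, last_activity):
    """Suggest the most frequent successor of the case's last activity."""
    counts = model.get(last_activity)
    return counts.most_common(1)[0][0] if counts else None

log = [
    ["register claim", "assess damage", "pay claim"],
    ["register claim", "assess damage", "reject claim"],
    ["register claim", "assess damage", "pay claim"],
]
model = train(log)
print(recommend_next(model, "assess damage"))  # pay claim
```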
Location-based services in buildings offer great advantages for people searching for places, products, or other people. In our paper we examine the feasibility of Bluetooth iBeacons for indoor localization. In the first part we characterize and evaluate the iBeacon technology through different experiments. In the second part our solution application is described. Our system estimates the position of the user’s smartphone based on RSSI measurements, using the built-in smartphone sensor and a building map with the required sender information. Trilateration is used as the positioning technique, in contrast to fingerprinting, in order to minimize the setup effort. The results are promising but cannot reach the same accuracy level as sensor-fusion or fingerprinting approaches.
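The two steps of such a pipeline can be sketched as follows, under assumed constants: RSSI is converted to a distance via the log-distance path-loss model (the calibration values `tx_power` and path-loss exponent `n` are environment-specific assumptions), and the position is then estimated by linearized trilateration over three beacons. This is an illustrative sketch, not the paper's implementation.

```python
import math

def rssi_to_distance(rssi, tx_power=-59.0, n=2.0):
    """Log-distance path-loss model: tx_power is the RSSI measured at 1 m."""
    return 10 ** ((tx_power - rssi) / (10 * n))

def trilaterate(beacons, distances):
    """Solve the 2x2 linear system obtained by subtracting circle equations."""
    (x1, y1), (x2, y2), (x3, y3) = beacons
    r1, r2, r3 = distances
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x2), 2 * (y3 - y2)
    c2 = r2**2 - r3**2 + x3**2 - x2**2 + y3**2 - y2**2
    det = a1 * b2 - a2 * b1
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# Three beacons at known map positions; distances chosen so the true
# position is (3, 4).
beacons = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
pos = trilaterate(beacons, [5.0, math.hypot(10 - 3, 4), math.hypot(3, 10 - 4)])
print(pos)  # (3.0, 4.0)
```

In practice the measured RSSI is noisy, so distances are usually smoothed (e.g. averaged over a window) before trilateration, which is one reason fingerprinting and sensor fusion reach higher accuracy.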
Business processes are important knowledge resources of a company. The knowledge contained in business processes conveys the procedures used to create products and services. However, the modelling and application of business processes are affected by problems related to knowledge transfer. This paper presents and implements a layered model to improve knowledge transfer, thus supporting the modelling and understanding of business process models. An evaluation of the approach is presented, and the results and further areas of application are discussed.
Enterprise Architecture (EA) management is an activity that seeks to foster the alignment of business and IT, and pursues various goals further operationalizing this alignment. Key to effective EA management is a framework that defines the roles, activities, and viewpoints used for EA management in accordance with the concerns that the stakeholders aim to address. Consensus holds that such frameworks are organization-specific, and hence they are designed in governance activities for EA management. As of today, top-down approaches to governance are used to derive organization-specific frameworks. These usually lack systematic mechanisms for improving the framework based on feedback from the responsible stakeholders. We outline a bottom-up approach to EA management governance that systematically observes the behavior of the actors in order to learn user concerns and recommend appropriate viewpoints. With this approach, we complement traditional top-down governance activities.
A sequence of transactions is a complex and multidimensional type of data. Feature construction can be used to reduce the data’s dimensionality in order to find behavioural patterns within such sequences. The patterns can be expressed using the blueprints of the constructed relevant features. These blueprints can then be used for real-time classification of other sequences.
Decision-making in the field of Enterprise Architecture (EA) is a complex task. Many organizations establish a set of complex processes and hierarchical structures to enable strategy-driven development of their EA. This leads to slow and inefficient decision-making, entailing poor time-to-market and discontented stakeholders. Collaborative EA delineates a lightweight approach to enabling EA decisions but often neglects strategic alignment. In this paper, we present an approach that integrates the concepts of collaborative EA and goal-driven decision-making through collaborative modeling of goal-oriented information demands based on ArchiMate’s motivation extension, in order to achieve goal-oriented EA decision support in a collaborative EA environment.
Managers recognize that software development project teams need to be developed and guided. Although technical skills are necessary, non-technical (NT) skills are equally, if not more, necessary for project success. Currently, there are no proven tools to measure the NT skills of software developers or software development teams. Behavioral markers (observable behaviors that have positive or negative impacts on individual or team performance) are beginning to be successfully used by airline and medical industries to measure NT skill performance. The purpose of this research is to develop and validate the behavior marker system tool that can be used by different managers or coaches to measure the NT skills of software development individuals and teams. This paper presents an empirical study conducted at the Software Factory where users of the behavior marker tool rated video clips of software development teams. The initial results show that the behavior marker tool can be reliably used with minimal training.
The goal of this in-depth scientific study is to develop and evaluate a user interface concept for a driver assistance system. The driver assistance system is intended to help the driver drive safely and energy-efficiently. The task is to create and evaluate a presentation concept, taking into account the particular requirements of secondary interactions in the vehicle. The goal of the conceptual phase is to develop a presentation that is as distraction-free as possible. For this purpose, norms, guidelines, and standards of in-car interaction are summarized and applied. The result is a presentation concept that can be implemented modularly and whose freedom from distraction is evaluated by means of a lane change test.
The perception of immeasurable vastness can evoke awe in humans, which in turn can trigger positive reactions. While awe itself is already well researched both theoretically and practically, there is very little research on the topic of immeasurable vastness. Such knowledge would be useful for deliberately evoking awe in humans. For this reason, a study was conducted to determine to what extent a feeling of immeasurable vastness can be created in virtual reality using a head-mounted display, and whether this gives rise to awe.
Two Stream Hypothesis: adaptation effects in social interactions with avatars in virtual reality
(2015)
This paper presents an experiment on the two-streams hypothesis. First, the psychological and technical foundations required for the experiment are laid out. The research question is then defined and the experimental setup discussed. The experiment tests whether there are different adaptation effects in the recognition and execution of ambiguous social actions. An experimental setup is developed in which participants react to the actions of virtual avatars either actively, through complementary actions, or passively, by pressing buttons. Finally, the results are evaluated and conclusions are drawn.
The term "virtual reality" describes the presentation of artificial worlds and the interaction with them. It is usually associated with expensive game and film productions. However, thanks to current developments, small development studios and end users can now also make use of motion-tracking systems. This report presents two prototypes that build on exactly these systems. The prototypes are intended to enable interaction with the environment and a feeling of immersion ("being right in the middle of it") in the context of serious games.
Background and purpose: Transapical aortic valve replacement (TAVR) is a recent minimally invasive surgical treatment technique for elderly and high-risk patients with severe aortic stenosis. In this paper, a simple and accurate image-based method is introduced to aid the intra-operative guidance of the TAVR procedure under 2-D X-ray fluoroscopy.
Methods: The proposed method fuses a 3-D aortic mesh model and anatomical valve landmarks with live 2-D fluoroscopic images. The 3-D aortic mesh model and landmarks are reconstructed from an interventional X-ray C-arm CT system, and a target area for valve implantation is automatically estimated using these aortic mesh models. Based on a template-based tracking approach, the overlay of the visualized 3-D aortic mesh model, landmarks and target area of implantation is updated onto the fluoroscopic images by approximating the aortic root motion from the motion of a pigtail catheter without contrast agent. Also, a rigid intensity-based registration algorithm is used to continuously track the aortic root motion in the presence of contrast agent. Furthermore, sensorless tracking of the aortic valve prosthesis is provided to guide the physician in placing the prosthesis appropriately into the estimated target area of implantation.
Results: Retrospective experiments were carried out on fifteen patient datasets from the clinical routine of the TAVR. The maximum displacement errors were less than 2.0 mm for both the dynamic overlay of the aortic mesh models and the image-based tracking of the prosthesis, which is within the clinically accepted ranges. Moreover, success rates of the proposed method above 91.0% were obtained for all tested patient datasets.
Conclusion: The results showed that the proposed method for computer-aided TAVR is potentially a helpful tool for physicians by automatically defining the accurate placement position of the prosthesis during the surgical procedure.
Detecting the adherence of driving rules in an energy-efficient, safe and adaptive driving system
(2016)
An adaptive and rule-based driving system is being developed that tries to improve the driving behavior in terms of energy efficiency and safety by giving recommendations. To do so, the driving system has to monitor the adherence of driving rules by matching the rules to the driving behavior. However, existing rule matching algorithms are not sufficient, as the data within a driving system changes frequently. In this paper, a rule matching algorithm is introduced that is able to handle frequently changing data within the context of the driving system. 15 journeys were used to evaluate the performance of the rule matching algorithms. The results showed that the introduced algorithm outperforms existing algorithms in the context of the driving system. Thus, the introduced algorithm is suited for matching frequently changing data against rules with higher performance, which is why it will be used in the driving system to detect broken energy-efficiency or safety-relevant driving rules.
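The paper does not disclose its rule matching algorithm; purely for illustration, the core idea of checking driving rules against a frequently changing driving state can be sketched as below. Rule names, thresholds, and state fields are invented assumptions, not taken from the paper.

```python
# Hypothetical sketch: match driving rules against the current driving
# state each time the state changes. All rules and values are invented.

def make_rule(name, predicate):
    # A rule pairs a name with a predicate over the driving state.
    return {"name": name, "holds": predicate}

RULES = [
    make_rule("speed_limit", lambda s: s["speed_kmh"] <= s["limit_kmh"]),
    make_rule("eco_rpm", lambda s: s["rpm"] <= 2500),
]

def broken_rules(state):
    """Return the names of all rules the current driving state violates."""
    return [r["name"] for r in RULES if not r["holds"](state)]

state = {"speed_kmh": 72, "limit_kmh": 50, "rpm": 3100}
print(broken_rules(state))  # ['speed_limit', 'eco_rpm']
```

In a real system this check would be re-run on every sensor update, which is why the paper's focus on performance under frequently changing data matters.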
In the last decades, several driving systems have been developed to improve driving behaviour in terms of energy efficiency or safety. However, these driving systems cover either energy efficiency or safety, not both. Furthermore, they do not consider the stress level of the driver when showing a recommendation, although stress can lead to unsafe or inefficient driving behaviour. In this paper, an approach is presented to consider the driver's stress level in a driving system for safe and energy-efficient driving behaviour. The driving system tries to suppress a recommendation when the driver is stressed, in order not to stress the driver additionally with recommendations in a stressful driving situation. This can lead to an increase in road safety and in the user acceptance of the driving system, as the driver is not bothered or stressed by the driving system.
The evaluation of the approach showed that the driving system is able to show recommendations to the driver while also reacting to a high stress level by suppressing recommendations, in order not to stress the driver additionally.
Nowadays, there is a rich diversity of sleep monitoring systems available on the market. They promise to offer information about the sleep quality of the user by recording a limited number of vital signals, mainly heart rate and body movement. Typically, fitness trackers, smart watches, smart shirts, smartphone applications and patches do not provide access to the raw sensor data. Moreover, the sleep classification algorithm and its agreement ratio with the gold standard, polysomnography (PSG), are not disclosed. Some commercial systems record and store the data on the wearable device, but the user needs to transfer and import it into specialised software applications or return it to the doctor for clinical evaluation of the data set. Thus, an immediate feedback mechanism and the possibility of remote control and supervision are lacking. Furthermore, many such systems only distinguish between sleep and wake states, or between wake, light sleep and deep sleep. It is not always clear how these stages are mapped to the four known sleep stages: REM, NREM1, NREM2, and NREM3-4 [1]. The goal of this research is to find a reduced-complexity method to process a minimum number of vital signals while providing accurate sleep classification results. The model we propose offers remote control and real-time supervision capabilities by using Internet of Things (IoT) technology. This paper focuses on the data processing method and the sleep classification logic. The body sensor network representing our data acquisition system will be described in a separate publication. Our solution showed promising results and a good potential to overcome the limitations of existing products. Further improvements will be made, and subjects of different ages and health conditions will be tested.
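To make the undisclosed-classifier problem concrete: a minimal rule-based epoch classifier over heart rate and movement counts might look like the following sketch. This is not the authors' algorithm; the thresholds and stage labels are illustrative assumptions only.

```python
# Illustrative sketch only: coarse sleep staging from two vital signals,
# as simple commercial trackers might do. Thresholds are invented.

def classify_epoch(heart_rate_bpm, movement_count):
    """Classify one 30-second epoch into wake / light / deep sleep."""
    if movement_count > 10:          # frequent movement suggests wakefulness
        return "wake"
    if heart_rate_bpm < 55 and movement_count == 0:
        return "deep"                # low HR and stillness suggest deep sleep
    return "light"

epochs = [(72, 15), (60, 3), (52, 0)]
print([classify_epoch(hr, mv) for hr, mv in epochs])  # ['wake', 'light', 'deep']
```

Such two- or three-state rules illustrate why mapping tracker output onto the four clinical stages (REM, NREM1-3) is ambiguous: the input signals simply carry less information than a full PSG.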
Context: An experiment-driven approach to software product and service development is gaining increasing attention as a way to channel limited resources to the efficient creation of customer value. In this approach, software capabilities are developed incrementally and validated in continuous experiments with stakeholders such as customers and users. The experiments provide factual feedback for guiding subsequent development.
Objective: This paper explores the state of the practice of experimentation in the software industry. It also identifies the key challenges and success factors that practitioners associate with the approach.
Method: A qualitative survey based on semi-structured interviews and thematic coding analysis was conducted. Ten Finnish software development companies, represented by thirteen interviewees, participated in the study.
Results: The study found that although the principles of continuous experimentation resonated with industry practitioners, the state of the practice is not yet mature. In particular, experimentation is rarely systematic and continuous. Key challenges relate to changing the organizational culture, accelerating the development cycle speed, and finding the right measures for customer value and product success. Success factors include a supportive organizational culture, deep customer and domain knowledge, and the availability of the relevant skills and tools to conduct experiments.
Conclusions: It is concluded that the major issues in moving towards continuous experimentation are on an organizational level; most significant technical challenges have been solved. An evolutionary approach is proposed as a way to transition towards experiment-driven development.
Software process improvement (SPI) has been around for decades: frameworks are proposed, success factors are studied, and experiences have been reported. However, the sheer mass of concepts, approaches, and standards published over the years overwhelms practitioners as well as researchers. What is out there? Are there new trends and emerging approaches? What are open issues? Still, we struggle to answer these questions about the current state of SPI and related research. In this article, we present results from an updated systematic mapping study to shed light on the field of SPI, to develop a big picture of the state of the art, and to draw conclusions for future research directions. An analysis of 769 publications draws a big picture of SPI-related research of the past quarter-century. Our study shows a high number of solution proposals, experience reports, and secondary studies, but only few theories and models on SPI in general. In particular, standard SPI models like CMMI and ISO/IEC 15504 are analyzed, enhanced, and evaluated for applicability in practice, but these standards are also critically discussed, e.g., from the perspective of SPI in small- to medium-sized companies, which leads to new specialized frameworks. New and specialized frameworks account for the majority of the contributions found (approx. 38%). Furthermore, we find a growing interest in success factors (approx. 16%) to aid companies in conducting SPI and in adapting agile principles and practices for SPI (approx. 10%). Beyond these specific topics, the study results also show an increasing interest in secondary studies with the purpose of aggregating and structuring SPI-related knowledge. Finally, the present study helps direct future research by identifying under-researched topics awaiting further investigation.
Software development consists to a large extent of human-based processes with continuously increasing demands regarding interdisciplinary team work. Understanding the dynamics of software teams can be seen as highly important to successful project execution. Hence, for future project managers, knowledge about non-technical processes in teams is significant. In this paper, we present a course unit that provides an environment in which students can learn and experience the impact of group dynamics on project performance and quality. The course unit uses the Tuckman model as its theoretical framework and borrows from controlled experiments to organize and implement its practical parts, in which students then experience the effects of, e.g., time pressure, resource bottlenecks, staff turnover, loss of key personnel, and other stress factors. We provide a detailed design of the course unit to allow for implementation in further software project management courses. Furthermore, we provide experiences obtained from two instances of this unit conducted in Munich and Karlskrona with 36 graduate students. We observed students building awareness of stress factors and developing countermeasures to reduce the impact of those factors. Moreover, students experienced what problems occur when teams work under stress and how to form a performing team despite exceptional situations.
For decades, Software Process Improvement (SPI) programs have been implemented, inter alia, to improve quality and speed of software development. To set up, guide, and carry out SPI projects, and to measure SPI state, impact, and success, a multitude of different SPI approaches and considerable experience are available. SPI addresses many aspects ranging from individual developer skills to entire organizations. It comprises for instance the optimization of specific activities in the software lifecycle as well as the creation of organization awareness and project culture. In the course of conducting a systematic mapping study on the state-of-the-art in SPI from a general perspective, we observed Global Software Engineering (GSE) becoming a topic of interest in recent years. Therefore, in this paper, we provide a detailed investigation of those papers from the overall systematic mapping study that were classified as addressing SPI in the context of GSE. From the main study’s result set, a set of 30 papers dealing with GSE was selected for an in-depth analysis using the systematic review instrument to study the contributions and to develop an initial picture of how GSE is considered from the perspective of SPI. Our findings show the analyzed papers delivering a substantial discussion of cultural models and how such models can be used to better address and align SPI programs with multi-national environments. Furthermore, experience is shared discussing how agile approaches can be implemented in companies working at the global scale. Finally, success factors and barriers are studied to help companies implementing SPI in a GSE context.
Software development consists to a large extent of human-based processes with continuously increasing demands regarding interdisciplinary team work. Understanding the dynamics of software teams can be seen as highly important to successful project execution. Hence, for future project managers, knowledge about non-technical processes in teams is significant. In this paper, we present a course unit that provides an environment in which students can learn and experience the role of different communication patterns in distributed agile software development. In particular, students gain awareness of the importance of communication by experiencing the impact of limited communication channels and the effects on collaboration and team performance. The course unit presented uses the controlled experiment instrument to provide the basic organization of a small software project carried out in virtual teams. We provide a detailed design of the course unit to allow for implementation in further courses. Furthermore, we provide experiences obtained from implementing this course unit with 16 graduate students. We observed students struggling with technical aspects and team coordination in general, while not realizing the importance of communication channels (or their absence). Furthermore, we could show the students that a lack of communication protocols impacts team coordination and performance regardless of the communication channels used.
Sleep is an important aspect in the life of every human being. The average sleep duration for an adult is approximately 7 h per day. Sleep is necessary to regenerate the physical and psychological state of a human. Bad sleep quality has a major impact on health status and can lead to different diseases. In this paper, an approach is presented which uses long-term monitoring of vital data, gathered by a body sensor during the day and the night and supported by a mobile application connected to an analyzing system, to estimate the sleep quality of its user as well as give recommendations to improve it in real time. Actimetry and historical data are used to improve the individual recommendations, based on common techniques from the areas of machine learning and big data analysis.
The troubles began when Tom, the business analyst, asked the customer what he wants. The customer came up with good ideas for software features. Tom created a brilliant roadmap and defined the requirements for a new software product. Mary, the development team leader, was already eager to start developing and happy when she got the requirements. She and her team went ahead and created the software right away. Afterwards, Paul tested the software against the requirements. As soon as the software fulfilled the requirements, Linda, the product manager, deployed it to the customer. The customer did not like the software and ignored it. Ringo, the head of software development, was fired. How come? Nowadays, we have tremendous capabilities for creating nearly all kinds of software to fulfill the needs of customers. We can apply agile practices for reacting flexibly to changing requirements, we can use distributed development, open source, or other means for creating software at low cost, we can use cloud technologies for deploying software rapidly, and we can get enormous amounts of data showing us how customers actually use software products. However, the sad reality is that around 90% of products fail, and more than 60% of the features of a typical software product are rarely or never used. But there is a silver lining – an insight regarding successful features: Around 60% of the successes stem from a significant change of an initial idea. This gives us a hint on how to build the right software for users and customers.
Software Process Improvement (SPI) programs have been implemented, inter alia, to improve quality and speed of software development. SPI addresses many aspects ranging from individual developer skills to entire organizations. It comprises, for instance, the optimization of specific activities in the software lifecycle as well as the creation of organizational awareness and project culture. In the course of conducting a systematic mapping study on the state-of-the-art in SPI from a general perspective, we observed Software Quality Management (SQM) being of certain relevance in SPI programs. In this paper, we provide a detailed investigation of those papers from the overall systematic mapping study that were classified as addressing SPI in the context of SQM (including testing). From the main study’s result set, 92 papers were selected for an in-depth systematic review to study the contributions and to develop an initial picture of how these topics are addressed in SPI. Our findings show a fairly pragmatic contribution set in which different solutions are proposed, discussed, and evaluated. Among others, our findings indicate a certain reluctance towards standard quality or (test) maturity models and a strong focus on custom review, testing, and documentation techniques, whereas a set of five selected improvement measures is almost equally addressed.
Context: Companies need capabilities to evaluate the customer value of software-intensive products and services. One way of systematically acquiring data on customer value is running continuous experiments as part of the overall development process. Objective: This paper investigates the first steps of transitioning towards continuous experimentation in a large company, including the challenges faced. Method: We conduct a single-case study using participant observation, interviews, and qualitative analysis of the collected data. Results: Results show that continuous experimentation was well received by the practitioners, and practising experimentation helped them to enhance their understanding of their product value and user needs. Although the complexities of a large multi-stakeholder business-to-business (B2B) environment presented several challenges, such as inaccessible users, it was possible to address impediments and integrate an experiment in an ongoing development project. Conclusion: Developing the capability for continuous experimentation in large organisations is a learning process which can be supported by a systematic introduction approach with the guidance of experts. We gained experience by introducing the approach on a small scale in a large organisation, and one of the major steps for future work is to understand how this can be scaled up to the whole development organisation.
Context: The current transformation of automotive development towards innovation, permanent learning, and adaptation to change is directing attention to the integration of agile methods. Although there have been efforts to apply agile methods in the automotive domain for many years, widespread adoption has not yet taken place.
Goal: This study aims to gain a better understanding of the forces that prevent the adoption of agile methods.
Method: Survey based on 16 semi-structured interviews from the automotive domain. The results are analyzed by means of thematic coding.
Results: Forces that prevent agile adoption are mainly of organizational, technical and social nature and address inertia, anxiety and context factors. Key challenges in agile adoption are related to transforming organizational structures and culture, achieving faster software release cycles without loss of quality, the importance of software reuse in combination with agile practices, appropriate quality assurance measures, and the collaboration with suppliers and other disciplines such as mechanics.
Conclusion: Significant challenges are imposed by specific characteristics of the automotive domain such as high quality requirements and many interfaces to surrounding rigid and inflexible processes. Several means are identified that promise to overcome these challenges.
Managing software process evolution: traditional, agile and beyond - how to handle process change
(2016)
This book focuses on the design, development, management, governance and application of evolving software processes that are aligned with changing business objectives, such as expansion to new domains or shifting to global production. In the context of an evolving business world, it examines the complete software process lifecycle, from the initial definition of a product to its systematic improvement. In doing so, it addresses difficult problems, such as how to implement processes in highly regulated domains or where to find a suitable notation system for documenting processes, and provides essential insights and tips to help readers manage process evolutions. And last but not least, it provides a wealth of examples and cases on how to deal with software evolution in practice.
Reflecting these topics, the book is divided into three parts. Part 1 focuses on software business transformation and addresses the questions of which process(es) to use and adapt, and how to organize process improvement programs. Subsequently, Part 2 mainly addresses process modeling. Lastly, Part 3 collects concrete approaches, experiences, and recommendations that can help to improve software processes, with a particular focus on specific lifecycle phases.
This book is aimed at anyone interested in understanding and optimizing software development tasks at their organization. While the experiences and ideas presented will be useful for both those readers who are unfamiliar with software process improvement and want to get an overview of the different aspects of the topic, and for those who are experts with many years of experience, it particularly targets the needs of researchers and Ph.D. students in the area of software and systems engineering or information systems who study advanced topics concerning the organization and management of (software development) projects and process improvements projects.
Information Systems in Distributed Environment (ISDE) is becoming a prominent standard in this globalization era due to advancements in information and communication technologies. The advent of the internet has supported Distributed Software Development (DSD) by introducing new concepts and opportunities, resulting in benefits such as scalability, flexibility, interdependence, reduced cost, resource pools, and usage tracking. The distributed development of information systems, as well as their deployment and operation in distributed environments, imposes new challenges for software organizations and can lead to business advantages. In distributed environments, business units collaborate across time zones, organizational boundaries, work cultures and geographical distances, which has ultimately led to an increasing diversification and growing complexity of cooperation among units. The real-world practice of the development, deployment and operation of information systems in globally distributed projects has been viewed from various perspectives, though technical and engineering viewpoints, in conjunction with managerial and organizational ones, have dominated researchers' attention so far. Successful participation in distributed environments, however, is ultimately a matter of the participants understanding and exploiting the particularities of their respective local contexts at specific points in time and exploring practical solutions through the local resources available.
This special issue of the Computer Standards & Interfaces journal therefore includes papers received from the public call for papers, as well as extended and improved versions of papers selected from the best of the International Workshop on Information Systems in Distributed Environment (ISDE 2014). It aims to serve as a forum that brings together academics, researchers, practitioners and students in the field of distributed information systems by presenting novel developments and lessons learned from real-world cases, and to promote the exchange of ideas, discussion and advancement in these areas.
Combining software product lines and agile software development in the automotive industry is promising. The goal is to obtain both the advantages of agile methods, such as short development cycles, and the advantages of systematic reuse, such as the effective management of variants. However, the combination also poses challenges and requires a suitable introduction or transformation strategy. Based on findings from an interview study and on existing product line developments, challenges and possible solutions are presented.
Analysis of multicellular patterns is required to understand tissue organizational processes. By using a multi-scale object oriented image processing method, the spatial information of cells can be extracted automatically. Instead of manual segmentation or indirect measurements, such as general distribution of contrast or flow, the orientation and distribution of individual cells is extracted for quantitative analysis. Relevant objects are identified by feature queries and no low-level knowledge of image processing is required.
Data collected from internet applications are mainly stored in the form of transactions. All transactions of one user form a sequence, which shows the user's behaviour on the site. Nowadays, it is important to be able to classify this behaviour in real time for various reasons: e.g., to increase the conversion rate of customers while they are in the store or to prevent fraudulent transactions before they are placed. However, this is difficult due to the complex structure of the data sequences (i.e., a mix of categorical and continuous data types, constant data updates) and the large amounts of data that are stored. Therefore, this thesis studies the classification of complex data sequences. It surveys the fields of time series analysis (temporal data mining), sequence data mining, and standard classification algorithms. It turns out that these algorithms are either difficult to apply to data sequences or do not deliver a classification: time series methods need a predefined model and are not able to handle complex data types; sequence classification algorithms such as the apriori algorithm family are not able to utilize the time aspect of the data. The strengths and weaknesses of the candidate algorithms are identified and used to build a new approach to the problem of classifying complex data sequences. The problem is solved by a two-step process. First, feature construction is used to create and discover suitable features in a training phase. Then, the blueprints of the discovered features are used in a formula during the classification phase to perform the real-time classification. The features are constructed by combining and aggregating the original data over the span of the sequence, including the elapsed time, by using a calculated time axis. Additionally, a combination of features and feature selection are used to simplify complex data types. This allows catching behavioural patterns that occur in the course of time.
This newly proposed approach combines techniques from several research fields. Part of the algorithm originates from the field of feature construction and is used to reveal behaviour over time and express this behaviour in the form of features. A combination of the features is used to highlight relations between them. The blueprints of these features can then be used to achieve classification in real time on an incoming data stream. An automated framework is presented that allows the features to adapt iteratively to a change in underlying patterns in the data stream. This core feature of the presented work is achieved by separating the feature application step from the computationally costly feature construction step and by iteratively restarting the feature construction step on the new incoming data. The algorithm and the corresponding models are described in detail and applied to three case studies (customer churn prediction, bot detection in computer games, credit card fraud detection). The case studies show that the proposed algorithm is able to find distinctive information in data sequences and use it effectively for classification tasks. The promising results indicate that the suggested approach can be applied to a wide range of other application areas that incorporate data sequences.
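The separation of the costly feature construction step from the cheap feature application step can be sketched as follows. This is a minimal illustration of the idea under stated assumptions, not the thesis's actual algorithm; the blueprint names and transaction fields are invented.

```python
# Hedged sketch: "blueprints" are aggregation recipes discovered in the
# offline training phase; applying them to an incoming sequence is cheap
# enough for real-time classification. All names are illustrative.

from statistics import mean

# Phase 1 (training, expensive, rerun periodically): discovered blueprints,
# each a named aggregation over a transaction sequence.
BLUEPRINTS = {
    "mean_amount": lambda seq: mean(t["amount"] for t in seq),
    "n_transactions": lambda seq: len(seq),
    "elapsed_time": lambda seq: seq[-1]["t"] - seq[0]["t"],
}

def apply_blueprints(sequence):
    """Phase 2 (classification, cheap): raw sequence -> feature vector."""
    return {name: f(sequence) for name, f in BLUEPRINTS.items()}

seq = [{"t": 0, "amount": 10.0}, {"t": 30, "amount": 50.0}]
print(apply_blueprints(seq))
# {'mean_amount': 30.0, 'n_transactions': 2, 'elapsed_time': 30}
```

The feature vector would then feed a standard classifier; adapting to drift amounts to rerunning phase 1 on new data and swapping in the new blueprints.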
This book showcases new and innovative approaches to biometric data capture and analysis, focusing especially on those that are characterized by non-intrusiveness, reliable prediction algorithms, and high user acceptance. It comprises the peer-reviewed papers from the international workshop on the subject that was held in Ancona, Italy, in October 2014 and featured sessions on ICT for health care, biometric data in automotive and home applications, embedded systems for biometric data analysis, biometric data analysis: EMG and ECG, and ICT for gait analysis. The background to the book is the challenge posed by the prevention and treatment of common, widespread chronic diseases in modern, aging societies. Capture of biometric data is a cornerstone for any analysis and treatment strategy. The latest advances in sensor technology allow accurate data measurement in a non-intrusive way, and in many cases it is necessary to provide online monitoring and real-time data capturing to support a patient’s prevention plans or to allow medical professionals to access the patient’s current status. This book will be of value to all with an interest in this expanding field.
Stress is becoming an important topic in modern life. Its influence results in a higher rate of health disorders such as burnout, heart problems, obesity, asthma, diabetes, depression and many others. Furthermore, an individual's behavior and capabilities can be directly affected, leading to altered cognition and impaired decision-making and problem-solving skills. In a dynamic and unpredictable environment such as the automotive domain, this can result in a higher risk of accidents. Several papers have addressed the estimation and prediction of drivers' stress levels while driving. Another important question concerns not only the stress level of the driver himself, but also the mutual influence between the driver and other drivers in the vicinity. This paper proposes a system that clusters the drivers in a given area and derives their individual stress levels. This information is analyzed to generate a stress map, which provides a graphical view of road sections with higher stress influence. The aggregated data can be used to generate navigation routes with lower stress influence, decreasing stress-influenced driving and improving road safety.
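Aggregating individual stress levels into a map could, under simple assumptions, look like the following sketch. The grid-cell clustering, cell size, and data format are hypothetical choices for illustration, not taken from the paper.

```python
from collections import defaultdict

def build_stress_map(readings, cell_size=100.0):
    """Cluster driver readings into grid cells and average stress per cell.

    readings: iterable of (x, y, stress) tuples, positions in metres.
    Returns {(cell_x, cell_y): mean_stress}.
    """
    sums = defaultdict(float)
    counts = defaultdict(int)
    for x, y, stress in readings:
        cell = (int(x // cell_size), int(y // cell_size))
        sums[cell] += stress
        counts[cell] += 1
    return {cell: sums[cell] / counts[cell] for cell in sums}

stress_map = build_stress_map([
    (10, 20, 0.8), (50, 70, 0.6),   # two drivers in the same 100 m cell
    (250, 30, 0.2),                 # a calmer road section
])
print(stress_map)
```

A navigation system could then penalize route segments crossing high-mean cells when computing a low-stress route.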
Stress is recognized as a predominant disorder with rising costs for rehabilitation and treatment. Currently there are several different approaches for determining and calculating stress levels. The methods are usually divided into two categories. The first category does not require any special equipment for measuring stress; it uses the variation in behaviour patterns that occurs under stress. The core disadvantage of this category is its limitation to specific use cases. The second category uses laboratory instruments and biological sensors. This category allows stress to be measured precisely and proficiently, but at the same time the instruments are neither mobile nor transportable and do not support real-time feedback. This work presents a mobile system that calculates stress. To achieve this, the signal of a mobile ECG sensor is analysed, processed and visualised on a mobile device such as a smartphone. This work also explains the stress measurement algorithm used. The result of this work is a portable system that can use a mobile device such as a smartphone as the visual interface for reporting the current stress level.
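The paper's exact algorithm is not reproduced here, but one common ingredient of ECG-based stress estimation is a heart-rate-variability measure such as RMSSD, computed from the RR intervals delivered by a beat detector. The following sketch shows only that standard metric, as an assumed building block rather than the authors' method.

```python
import math

def rmssd(rr_intervals_ms):
    """Root mean square of successive RR-interval differences (ms).

    Lower RMSSD generally indicates reduced parasympathetic activity,
    which is commonly associated with higher stress.
    """
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# RR intervals (milliseconds) as they might arrive from a mobile ECG sensor
print(rmssd([800, 810, 790, 805]))
```

On a smartphone, such a value would be computed over a sliding window of recent beats and mapped to the displayed stress level.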
Many people need help in their daily lives with washing, selecting and managing their clothing. The goal of this work is to design an assistant system (eKlarA) that supports the user with recommendations for choosing clothing combinations, finding clothing and washing it. The idea behind eKlarA is to build a system that uses sensors to identify clothing items and their state in the clothing cycle. The clothing cycle consists of the stations closet, laundry basket and washing machine, in one or several places. The system uses information about the clothing, the weather and the calendar to support the user in the different steps of the clothing cycle. A first prototype of this system has been developed and tested; the test results are presented in this work.
Besides optimisation of the car itself, energy efficiency and safety can also be increased by optimising driving behaviour. Based on this fact, a driving system is in development whose goal is to educate the driver in energy-efficient and safe driving. It monitors the driver, the car and the environment and gives recommendations relevant to energy efficiency and safety. However, the driving system tries not to distract or bother the driver with recommendations, for example during stressful driving situations or when the driver is not interested in a recommendation. Therefore, the driving system monitors the stress level of the driver as well as the driver's reaction to a given recommendation, and decides whether to give a recommendation or not. This makes it possible to suppress recommendations when needed and thus to increase road safety and user acceptance of the driving system.
The digitization of our society changes the way we live, work, learn, communicate, and collaborate. The Internet of Things, enterprise social networks, adaptive case management, mobility systems, analytics for big data, and cloud services environments are emerging to support smart connected products and services and the digital transformation. Biological metaphors of living and adaptable ecosystems provide the logical foundation for self-optimizing and resilient run-time environments for intelligent business services and service-oriented enterprise architectures. Our aim is to support flexibility and agile transformations for both business domains and related information technology. The present research paper investigates mechanisms for decision analytics in the context of multi-perspective explorations of enterprise services and their digital enterprise architectures by extending original architecture reference models with state-of-the-art elements for agile architectural engineering for digitization and collaborative architectural decision support. The paper focuses on digital transformations of business and IT and integrates fundamental mappings between adaptable digital enterprise architectures and service-oriented information systems. We put a spotlight on the example domain of the Internet of Things.
The amount of image data has been rising exponentially over the last decades due to numerous trends such as social networks, smartphones, automotive, biology, medicine and robotics. Traditionally, file systems are used as storage. Although they are easy to use and can handle large data volumes, they are suboptimal for efficient sequential image processing because their data organisation is limited to single images. Database systems, and especially column stores, support more structured storage and access methods at the raw-data level for entire series.
In this paper we propose definitions of various layouts for the efficient storage of raw image data and metadata in a column store. These schemes are designed to improve the runtime behaviour of image processing operations. We present a tool called the column-store Image Processing Toolbox (cIPT) that allows data layouts and operations to be easily combined for different image processing scenarios.
The experimental evaluation of a classification task on a real-world image dataset indicates a performance increase of up to 15x on a column store compared to a traditional row store (PostgreSQL), while space consumption is reduced by a factor of 7. With these results cIPT provides the basis for a future mature database feature.
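The basic idea behind a columnar layout for image data, as opposed to one record per pixel, can be illustrated with a minimal sketch. The attribute names and the layout below are hypothetical illustrations, not cIPT's actual schema.

```python
# Row layout: one record per pixel. Scanning a single band must still
# read every record, so locality is poor.
row_store = [
    {"img": 0, "x": 0, "y": 0, "r": 10, "g": 20, "b": 30},
    {"img": 0, "x": 1, "y": 0, "r": 11, "g": 21, "b": 31},
]

# Column layout: one array per attribute. A whole band is contiguous,
# so scans over one channel across an entire image series touch only
# the data they actually need.
col_store = {
    "img": [0, 0],
    "x":   [0, 1],
    "y":   [0, 0],
    "r":   [10, 11],
    "g":   [20, 21],
    "b":   [30, 31],
}

# Mean of the red band: the column store reads one contiguous array.
mean_r = sum(col_store["r"]) / len(col_store["r"])
print(mean_r)  # 10.5
```

Real column stores add compression per column, which is one reason the reported space reduction is plausible: values within one attribute are far more homogeneous than whole pixel records.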
In this paper we present our work in progress on revisiting traditional DBMS mechanisms to manage space on native Flash and how it is administered by the DBA. Our observations and initial results show that the standard logical database structures can be used for the physical organization of data on native Flash, and that at the same time higher DBMS performance is achieved without incurring extra DBA overhead. An initial experimental evaluation indicates a 20% increase in transactional throughput under TPC-C through intelligent data placement on Flash, with fewer erase operations and thus better Flash longevity.
Purpose – This paper aims to complement the current understanding about user engagement in electronic word-of-mouth (eWoM) communications across online services and product communities. It examines the effect of the senders’ prior experience with products and services, and their extent of acquaintance with other community members, on user engagement with the eWoM.
Design/methodology/approach – The study used a sample of 576 unique user postings from the corporate fan page of two German firms: a service community of a telecom provider and a product community of a car manufacturer. Multiple regression analysis is used to test the conceptual model.
Findings – Senders’ prior experience and acquaintance positively affect user engagement with eWoM, and these effects differ across communities for products and services and across their influence on “likes” and “comments”. The results also suggest that communities for products are orientated toward information sharing, while those discussing services engage in information building.
Research limitations/implications – This research explains mechanisms of user engagement with eWoM and opens directions for future research around motives, content and social media tools within the structures of online communities. The insights on information-handling dimensions of online tools and antecedents to their use contribute to research on two topics prioritized by the Marketing Science Institute – "Measuring and Communicating the Value of Online Marketing Activities and Investments" and "Leveraging Digital/Social/Mobile Technology".
Practical implications – This research offers insights for firms to leverage user engagement and facilitate eWoM generation through members who have a higher number of acquaintances or who have more experience with the product or service. Executives should concentrate their community engagement strategies on the identification and utilization of power users. The conceptualization and empirical test about the role of likes and comments will help social media managers to create and better capture value from their social media metrics.
Originality/value – The insights about the underlying factors that influence engagement with eWoM advance our understanding about the usage of online content.
Industrie 4.0 - Outlook
(2016)
For companies it is important to set the strategic course for their Industrie 4.0 direction early on and to build up experience in handling Industrie 4.0 technologies. However, some of the technologies relevant to Industrie 4.0 are not expected to exploit their full efficiency potential for another 5 to 10 years. The introduction of Industrie 4.0 affects almost every area of a company and must therefore be understood, planned and actively managed not only as a digital transformation but also as a cultural change within the organization. Topics such as data protection and IT security are not only important prerequisites for a successful introduction of Industrie 4.0; as essential acceptance and success factors, they must be anchored consistently and throughout the digital systems.
Significant advances have been achieved in mobile robot localization and mapping in dynamic environments; however, these approaches are mostly incapable of dealing with the physical properties of automotive radar sensors. In this paper we present an accurate and robust solution to this problem by introducing a memory-efficient cluster map representation. Our approach is validated by experiments on a public parking space with pedestrians, moving cars and different parking configurations, providing a challenging dynamic environment. The results prove its ability to reproducibly localize our vehicle within an error margin below 1% with respect to ground truth, using only point-based radar targets. A decay process enables our map representation to support local updates.
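A decay process that lets a map forget stale clusters while supporting local updates could be sketched as follows. The weight representation, the decay factor, and the pruning threshold are assumptions for illustration, not details from the paper.

```python
DECAY = 0.9           # per-update weight multiplier for unseen clusters
MIN_WEIGHT = 0.05     # clusters below this weight are pruned

def update_cluster_map(clusters, observed_ids):
    """Decay every cluster, refresh those re-observed, prune stale ones.

    clusters: {cluster_id: weight}; observed_ids: ids seen in this radar scan.
    """
    updated = {}
    for cid, weight in clusters.items():
        weight = 1.0 if cid in observed_ids else weight * DECAY
        if weight >= MIN_WEIGHT:
            updated[cid] = weight
    for cid in observed_ids:
        updated.setdefault(cid, 1.0)    # newly observed cluster
    return updated

m = {"wall": 1.0, "parked_car": 0.04}
m = update_cluster_map(m, {"wall", "pedestrian"})
print(sorted(m))  # ['pedestrian', 'wall'] - the stale cluster was pruned
```

The effect is that static structure (re-observed on every pass) keeps full weight, while objects that leave the scene, such as a departing parked car, fade out locally without a global map rebuild.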
What does a successful introduction of Industrie 4.0 look like? This book systematically presents the concept, paradigms and relevant technologies of Industrie 4.0 as well as their overall interrelations. In contrast to the common, purely technological and application-oriented perspective, the book additionally merges the strategic, tactical and operational levels of consideration into one integrative strand. Its centrepiece is a process model that describes the need for action at the strategic and operational levels. A practical case, various Industrie 4.0 use cases and renowned experts from research and practice make this book interesting for newcomers as well as for middle and upper managers interested in implementation who want to gain a new perspective on the complexity of the topic. The glossary makes the book a valuable reference work on Industrie 4.0.
For financial reasons, SMEs often do not see themselves in a position to invest in fundamental Industrie 4.0 technologies. The main reservations cited are a supposedly poor cost-benefit ratio and long pay-back cycles. Their current challenges lie instead in ever-advancing internationalization and in the growing pressure to innovate created by the competition. It is of course well known that the increasing interconnection of production facilities in Industrie 4.0 also entails risks for IT and data security. Problems with data quality, stability, interfaces and legal issues also contribute decisively to companies' uncertainty. Given the ever-increasing future interconnection between companies and stakeholders, suppliers in particular must see it as their duty to take up and engage with Industrie 4.0. These companies especially must realize that only by employing suitable information and communication technologies in the future will they still be able to remain part of the value chain between their customers and suppliers.
The Internet of Things, enterprise social networks, adaptive case management, mobility systems, analytics for big data, and cloud environments are emerging to support smart connected, i.e. digital, products and services and the digital transformation. Biological metaphors of living and adaptable ecosystems currently provide the logical foundation for resilient run-time environments with service-oriented digitization architectures and for self-optimizing intelligent business services and related distributed information systems. We investigate mechanisms for the flexible adaptation and evolution of information systems with digital architectures in the context of the ongoing digital transformation. The goal is to support flexible and agile transformations for both business and related information systems through adaptation and dynamic evolution of their digital architectures. The present research paper investigates mechanisms of decision analytics for digitization architectures, putting a spotlight on micro-granular Internet of Things architectures, by extending original enterprise architecture reference models with digitization architectures and their multi-perspective architectural decision management.
How digitally is a company positioned? How far along is it compared with other companies in its industry? Digital maturity models are well suited to finding this out. They describe the current situation, encourage reflection on the important questions of digitization and show which factors influence one another. Used continuously, they can serve to monitor the digital transformation process.
Based on well-established robotic concepts of autonomous localization and navigation, we present a system prototype, implemented in the Robot Operating System (ROS), that assists camera-based indoor navigation for human users. Our prototype takes advantage of state-of-the-art computer vision and robotic methods and is designed for assistive indoor guidance. We employ a vibro-tactile belt as a guiding device that renders derived motion suggestions to the user via vibration patterns. We evaluated the effectiveness of a variety of vibro-tactile feedback patterns for guiding blindfolded users. Our prototype demonstrates that a vision-based system can support human navigation and may also assist the visually impaired in a human-centered way.
Organizations have identified the opportunities of big data analytics to support the business with problem-specific insights through the exploitation of generated data. Socio-technical solutions are developed in big data projects to achieve competitive advantage. Although these projects are aligned to specific business needs, common architectural challenges are not addressed in a comprehensive manner. Enterprise architecture (EA) management is a holistic approach to tackling complex business and IT architectures. The transformation of an organization's EA is influenced by big data transformation processes and their data-driven approach on all layers. In this paper, we review the big data literature to analyze which requirements for the EA management discipline are proposed. Based on a systematic literature identification, conceptual categories of requirements for EA management are elicited using an inductive category formation. These conceptual categories constitute a category system that facilitates a new perspective on EA management and fosters the innovation-driven evolution of the EA management discipline.
Analysis is an important part of the enterprise architecture management process. Prior to decisions regarding transformation of the enterprise architecture, the current situation and the outcomes of alternative action plans have to be analysed. Many analysis approaches have been proposed by researchers, and current enterprise architecture management tools implement analysis functionalities. However, little work has been done on structuring and classifying enterprise architecture analysis approaches. This paper collects and extends existing classification schemes, presenting a framework for the classification of enterprise architecture analysis. For evaluation, a collection of enterprise architecture analysis approaches has been classified with this framework. As a result, the description of these approaches has been assessed, a common set of important categories for enterprise architecture analysis classification has been derived, and suggestions for further development are drawn.
On the way to achieving higher degrees of autonomy for vehicles in complicated, ever-changing scenarios, the localization problem plays a very important role. The Simultaneous Localization and Mapping (SLAM) problem in particular has been studied extensively in the past. For an autonomous system in the real world, we present a cost-efficient, robust and precise localization approach based on GraphSLAM and graph optimization using radar sensors. We show on a dynamically changing parking lot layout that both mapping and localization accuracy are very high. To evaluate the performance of the mapping algorithm, a highly accurate ground-truth map generated from a total station was used. Localization results are compared to a high-precision DGPS/INS system. Using these methods, we can show the strong performance of our algorithm.
Reliable and accurate car driver head pose estimation is an important function for the next generation of advanced driver assistance systems that need to consider the driver state in their analysis. For optimal performance, head pose estimation needs to be non-invasive, calibration-free and accurate across varying driving and illumination conditions. In this pilot study we investigate a 3D head pose estimation system that automatically fits a statistical 3D face model to measurements of a driver's face, acquired with a low-cost depth sensor on challenging real-world data. We evaluate the results of our sensor-independent, driver-adaptive approach against those of a state-of-the-art camera-based 2D face tracking system and of a non-adaptive 3D model, relative to our own ground-truth data, and compare to other 3D benchmarks. We find large accuracy benefits for the adaptive 3D approach.