Informatik
Year of publication
- 2016 (81)
Document Type
- Conference proceeding (41)
- Book chapter (23)
- Journal article (10)
- Anthology (4)
- Book (2)
- Doctoral Thesis (1)
Is part of the Bibliography
- yes (81)
Institute
- Informatik (81)
Publisher
- Springer (31)
- Gesellschaft für Informatik e.V. (14)
- IEEE (9)
- Hochschule Reutlingen (8)
- Elsevier (5)
- Association for Computing Machinery (1)
- De Gruyter (1)
- Elektronikpraxis, Vogel Business Media GmbH & Co. KG (1)
- Emerald (1)
- IARIA (1)
Context: An experiment-driven approach to software product and service development is gaining increasing attention as a way to channel limited resources to the efficient creation of customer value. In this approach, software capabilities are developed incrementally and validated in continuous experiments with stakeholders such as customers and users. The experiments provide factual feedback for guiding subsequent development.
Objective: This paper explores the state of the practice of experimentation in the software industry. It also identifies the key challenges and success factors that practitioners associate with the approach.
Method: A qualitative survey based on semi-structured interviews and thematic coding analysis was conducted. Ten Finnish software development companies, represented by thirteen interviewees, participated in the study.
Results: The study found that although the principles of continuous experimentation resonated with industry practitioners, the state of the practice is not yet mature. In particular, experimentation is rarely systematic and continuous. Key challenges relate to changing the organizational culture, accelerating the development cycle speed, and finding the right measures for customer value and product success. Success factors include a supportive organizational culture, deep customer and domain knowledge, and the availability of the relevant skills and tools to conduct experiments.
Conclusions: It is concluded that the major issues in moving towards continuous experimentation are on an organizational level; the most significant technical challenges have been solved. An evolutionary approach is proposed as a way to transition towards experiment-driven development.
Software process improvement (SPI) has been around for decades: frameworks are proposed, success factors are studied, and experiences are reported. However, the sheer mass of concepts, approaches, and standards published over the years overwhelms practitioners as well as researchers. What is out there? Are there new trends and emerging approaches? What are the open issues? Still, we struggle to answer these questions about the current state of SPI and related research. In this article, we present results from an updated systematic mapping study to shed light on the field of SPI, to develop a big picture of the state of the art, and to draw conclusions for future research directions. An analysis of 769 publications draws a big picture of SPI-related research of the past quarter-century. Our study shows a high number of solution proposals, experience reports, and secondary studies, but only a few theories and models on SPI in general. In particular, standard SPI models like CMMI and ISO/IEC 15504 are analyzed, enhanced, and evaluated for applicability in practice, but these standards are also critically discussed, e.g., from the perspective of SPI in small- to medium-sized companies, which leads to new specialized frameworks. New and specialized frameworks account for the majority of the contributions found (approx. 38%). Furthermore, we find a growing interest in success factors (approx. 16%) to aid companies in conducting SPI and in adapting agile principles and practices for SPI (approx. 10%). Beyond these specific topics, the study results also show an increasing interest in secondary studies with the purpose of aggregating and structuring SPI-related knowledge. Finally, the present study helps direct future research by identifying under-researched topics awaiting further investigation.
Software development consists to a large extent of human-based processes with continuously increasing demands regarding interdisciplinary team work. Understanding the dynamics of software teams can be seen as highly important to successful project execution. Hence, for future project managers, knowledge about non-technical processes in teams is significant. In this paper, we present a course unit that provides an environment in which students can learn and experience the impact of group dynamics on project performance and quality. The course unit uses the Tuckman model as theoretical framework, and borrows from controlled experiments to organize and implement its practical parts in which students then experience the effects of, e.g., time pressure, resource bottlenecks, staff turnover, loss of key personnel, and other stress factors. We provide a detailed design of the course unit to allow for implementation in further software project management courses. Furthermore, we provide experiences obtained from two instances of this unit conducted in Munich and Karlskrona with 36 graduate students. We observed students building awareness of stress factors and developing countermeasures to reduce the impact of those factors. Moreover, students experienced what problems occur when teams work under stress and how to form a performing team despite exceptional situations.
For decades, Software Process Improvement (SPI) programs have been implemented, inter alia, to improve the quality and speed of software development. To set up, guide, and carry out SPI projects, and to measure SPI state, impact, and success, a multitude of different SPI approaches and considerable experience are available. SPI addresses many aspects ranging from individual developer skills to entire organizations. It comprises, for instance, the optimization of specific activities in the software lifecycle as well as the creation of organization awareness and project culture. In the course of conducting a systematic mapping study on the state of the art in SPI from a general perspective, we observed Global Software Engineering (GSE) becoming a topic of interest in recent years. Therefore, in this paper, we provide a detailed investigation of those papers from the overall systematic mapping study that were classified as addressing SPI in the context of GSE. From the main study’s result set, a set of 30 papers dealing with GSE was selected for an in-depth analysis using the systematic review instrument to study the contributions and to develop an initial picture of how GSE is considered from the perspective of SPI. Our findings show the analyzed papers delivering a substantial discussion of cultural models and how such models can be used to better address and align SPI programs with multi-national environments. Furthermore, experience is shared discussing how agile approaches can be implemented in companies working at the global scale. Finally, success factors and barriers are studied to help companies implement SPI in a GSE context.
Software development consists to a large extent of human-based processes with continuously increasing demands regarding interdisciplinary team work. Understanding the dynamics of software teams can be seen as highly important to successful project execution. Hence, for future project managers, knowledge about non-technical processes in teams is significant. In this paper, we present a course unit that provides an environment in which students can learn and experience the role of different communication patterns in distributed agile software development. In particular, students gain awareness about the importance of communication by experiencing the impact of limitations of communication channels and the effects on collaboration and team performance. The course unit presented uses the controlled experiment instrument to provide the basic organization of a small software project carried out in virtual teams. We provide a detailed design of the course unit to allow for implementation in further courses. Furthermore, we provide experiences obtained from implementing this course unit with 16 graduate students. We observed students struggling with technical aspects and team coordination in general, while not realizing the importance of communication channels (or their absence). Furthermore, we could show the students that a lack of communication protocols impacts team coordination and performance regardless of the communication channels used.
The internet of things, enterprise social networks, adaptive case management, mobility systems, analytics for big data, and cloud environments are emerging to support smart connected, i.e. digital, products and services and the digital transformation. Biological metaphors for living and adaptable ecosystems currently provide the logical foundation for resilient run-time environments with service-oriented digitization architectures and for self-optimizing intelligent business services and related distributed information systems. We are investigating mechanisms for the flexible adaptation and evolution of information systems with digital architectures in the context of the ongoing digital transformation. The goal is to support flexible and agile transformations for both business and related information systems through the adaptation and dynamic evolution of their digital architectures. The present research paper investigates mechanisms of decision analytics for digitization architectures, putting a spotlight on Internet of Things micro-granular architectures, by extending original enterprise architecture reference models with digitization architectures and their multi-perspective architectural decision management.
The digitization of our society changes the way we live, work, learn, communicate, and collaborate. This disruptive change interacts with all information processes and systems that have been important business enablers in the context of digitization for years. Our aim is to support flexibility and agile transformations for both business domains and related information technology with more flexible enterprise information systems through the adaptation and evolution of digital enterprise architectures. The present research paper investigates the continuous bottom-up integration of micro-granular architectures for a huge number of dynamically growing systems and services, such as microservices and the Internet of Things, as part of a new digital enterprise architecture. To integrate micro-granular architecture models into living architectural model versions, we extend more traditional enterprise architecture reference models with state-of-the-art elements for agile architectural engineering to support the digitization of products, services, and processes.
How digitally positioned is a company? How far along is it compared with other companies in its industry? Digital maturity models are well suited to finding this out. They provide a description of the current situation, encourage reflection on the key questions of digitization, and show which factors influence one another. Applied continuously, they can be used to monitor the digital transformation process.
On the way to achieving higher degrees of autonomy for vehicles in complicated, ever-changing scenarios, the localization problem plays a very important role. The Simultaneous Localization and Mapping (SLAM) problem in particular has been studied extensively in the past. For an autonomous system in the real world, we present a very cost-efficient, robust and very precise localization approach based on GraphSLAM and graph optimization using radar sensors. We show on a dynamically changing parking lot layout that both mapping and localization accuracy are very high. To evaluate the performance of the mapping algorithm, a highly accurate ground truth map generated from a total station was used. Localization results are compared to a high-precision DGPS/INS system. Utilizing these methods, we can show the strong performance of our algorithm.
Reliable and accurate car driver head pose estimation is an important function for the next generation of advanced driver assistance systems that need to consider the driver state in their analysis. For optimal performance, head pose estimation needs to be non-invasive, calibration-free and accurate for varying driving and illumination conditions. In this pilot study we investigate a 3D head pose estimation system that automatically fits a statistical 3D face model to measurements of a driver’s face, acquired with a low-cost depth sensor on challenging real-world data. We evaluate the results of our sensor-independent, driver-adaptive approach against those of a state-of-the-art camera-based 2D face tracking system as well as a non-adaptive 3D model, relative to our own ground-truth data, and compare to other 3D benchmarks. We find large accuracy benefits of the adaptive 3D approach.
This book presents emerging trends in the evolution of service-oriented and enterprise architectures. New architectures and methods of both business and IT are integrating services to support mobility systems, the internet of things, ubiquitous computing, collaborative and adaptive business processes, big data, and cloud ecosystems. They inspire current and future digital strategies and create new opportunities for the digital transformation of new digital products and services. Service-Oriented Architectures (SOA) and Enterprise Architectures (EA) have emerged as useful frameworks for developing interoperable, large-scale systems, typically implementing various standards, like web services, REST, and microservices. Managing the adaptation and evolution of such systems presents a great challenge. Service-Oriented Architecture enables flexibility through loose coupling, both between the services themselves and between the IT organizations that manage them. Enterprises evolve continuously by transforming and extending their services, processes and information systems. Enterprise Architectures provide a holistic blueprint to help define the structure and operation of an organization with the goal of determining how an organization can most effectively achieve its objectives. The book proposes several approaches to address the challenges of the service-oriented evolution of digital enterprise and software architectures.
The evolution of Service-Oriented Architectures (SOA) presents many challenges due to their complex, dynamic and heterogeneous nature. We describe how SOA design principles can facilitate SOA evolvability and examine several approaches to support SOA evolution. SOA evolution approaches can be classified based on the level of granularity they address, namely, service code level, service interaction level and model level. We also discuss emerging trends, such as microservices and knowledge-based support, which can enhance the evolution of future SOA systems.
The digital transformation of our society changes the way we live, work, learn, communicate, and collaborate. The digitization of software-intensive products and services is enabled basically by four megatrends: cloud computing, big data, mobile systems, and social technologies. This disruptive change interacts with all information processes and systems that are important business enablers for the current digital transformation. The internet of things, social collaboration systems for adaptive case management, mobility systems and services for big data in cloud services environments are emerging to support intelligent user-centered and social community systems. Modern enterprises see themselves confronted with an ever-growing design space to engineer the business models of the future as well as their IT support. The decision analytics in this field becomes increasingly complex, and decision support, particularly for the development and evolution of sustainable enterprise architectures (EA), is duly needed. With the advent of intelligent user-centered and social community systems, the challenging decision processes can be supported in more flexible and intuitive ways. Tapping into these systems and techniques, the engineers and managers of the enterprise architecture become part of a viable enterprise, i.e. a resilient and continuously evolving system that develops innovative business models.
The Internet of Things (IoT), enterprise social networks, adaptive case management, mobility systems, analytics for big data, and cloud services environments are emerging to support smart connected products and services and the digital transformation. Biological metaphors of living and adaptable ecosystems with service-oriented enterprise architectures provide the foundation for self-optimizing and resilient run-time environments for intelligent business services and related distributed information systems. We are investigating mechanisms for the flexible adaptation and evolution of the next digital enterprise architecture systems in the context of the digital transformation. Our aim is to support flexibility and agile transformation for both business and related enterprise systems through the adaptation and dynamic evolution of digital enterprise architectures. The present research paper investigates mechanisms for decision case management in the context of multi-perspective explorations of enterprise services and Internet of Things architectures by extending original enterprise architecture reference models with state-of-the-art elements for architectural engineering for digitization and architectural decision support.
An enormous amount of data in the context of business processes is stored as images. These images contain valuable information for business process management. Up to now, this data has had to be integrated manually into the business process. Advances in capturing technology make it possible to extract information from an increasing number of images. Therefore, we systematically investigate the potential of Image Mining for business process management through a literature review and an in-depth analysis of the business process lifecycle. As a first step to evaluate our research, we developed a prototype for recovering process model information from drawings using RapidMiner.
Preface of IDEA 2015
(2016)
Digitization is more than using digital technologies to transfer data and perform computations and tasks. Digitization embraces disruptive effects of digital technologies on economy and society. To capture these effects, two perspectives are introduced, the product and the value-creation perspective. In the product perspective, digitization enables the transition from material, static products to interactive and configurable services. In the value-creation perspective, digitization facilitates the transition from centralized, isolated models of value creation, to bidirectional, co-creation oriented approaches of value creation.
The Internet of Things, enterprise social networks, adaptive case management, mobility systems, analytics for big data, and cloud services environments are emerging to support smart connected products and services and the digital transformation. Biological metaphors of living and adaptable ecosystems provide the logical foundation for self-optimizing and resilient run-time environments for intelligent business services and related distributed information systems with service-oriented enterprise architectures. We are investigating mechanisms for the flexible adaptation and evolution of the next digital enterprise architecture systems in the context of the digital transformation. Our aim is to support flexibility and agile transformation for both business and related enterprise systems through the adaptation and dynamic evolution of digital enterprise architectures. The present research paper investigates digital transformations of business and IT and integrates fundamental mappings between adaptable digital enterprise architectures and service-oriented information systems. We put a spotlight on the example domain of the Internet of Things.
IT environments that consist of a very large number of rather small structures like microservices, Internet of Things (IoT) components, or mobility systems are emerging to support flexible and agile products and services in the age of digital transformation. Biological metaphors of living and adaptable ecosystems with service-oriented enterprise architectures provide the foundation for self-optimizing, resilient run-time environments and distributed information systems. We are extending Enterprise Architecture (EA) methodologies and models that cover a high degree of heterogeneity and distribution to support the digital transformation and related information systems with micro-granular architectures. Our aim is to support flexibility and agile transformation for both IT and business capabilities within adaptable digital enterprise architectures. The present research paper investigates mechanisms for integrating Microservice Architectures (MSA) by extending original enterprise architecture reference models with elements for more flexible architectural metamodels and EA-mini-descriptions.
Background and purpose: Transapical aortic valve replacement (TAVR) is a recent minimally invasive surgical treatment technique for elderly and high-risk patients with severe aortic stenosis. In this paper, a simple and accurate image-based method is introduced to aid the intra-operative guidance of the TAVR procedure under 2-D X-ray fluoroscopy.
Methods: The proposed method fuses a 3-D aortic mesh model and anatomical valve landmarks with live 2-D fluoroscopic images. The 3-D aortic mesh model and landmarks are reconstructed from an interventional X-ray C-arm CT system, and a target area for valve implantation is automatically estimated using these aortic mesh models. Based on a template-based tracking approach, the overlay of the visualized 3-D aortic mesh model, landmarks and target area of implantation is updated onto fluoroscopic images by approximating the aortic root motion from a pigtail catheter motion without contrast agent. Also, a rigid intensity-based registration algorithm is used to continuously track the aortic root motion in the presence of contrast agent. Furthermore, a sensorless tracking of the aortic valve prosthesis is provided to guide the physician to perform the appropriate placement of the prosthesis into the estimated target area of implantation.
Results: Retrospective experiments were carried out on fifteen patient datasets from the clinical routine of the TAVR. The maximum displacement errors were less than 2.0 mm for both the dynamic overlay of aortic mesh models and image-based tracking of the prosthesis, and within the clinically accepted ranges. Moreover, the success rates of the proposed method were above 91.0% for all tested patient datasets.
Conclusion: The results showed that the proposed method for computer-aided TAVR is potentially a helpful tool for physicians by automatically defining the accurate placement position of the prosthesis during the surgical procedure.
This paper presents a concurrency control mechanism that does not follow a "one concurrency control mechanism fits all needs" strategy. With the presented mechanism a transaction runs under several concurrency control mechanisms and the appropriate one is chosen based on the accessed data. For this purpose, the data is divided into four classes based on its access type and usage (semantics). Class O (the optimistic class) implements a first-committer-wins strategy, class R (the reconciliation class) implements a first-n-committers-win strategy, class P (the pessimistic class) implements a first-reader-wins strategy, and class E (the escrow class) implements a first-n-readers-win strategy. Accordingly, the model is called O|R|P|E. The selected concurrency control mechanism may be automatically adapted at run-time according to the current load or a known usage profile. This run-time adaptation allows O|R|P|E to balance the commit rate and the response time even under changing conditions. O|R|P|E outperforms Snapshot Isolation concurrency control in terms of response time by a factor of approximately 4.5 under heavy transactional load (4000 concurrent transactions). As a consequence, the degree of concurrency is 3.2 times higher.
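The class-based dispatch described in the abstract can be sketched in a few lines. This is a minimal illustration of the idea, not the paper's implementation; the class assignments and function names are invented for the example.

```python
# Hedged sketch of the O|R|P|E idea: each data item is assigned to one of
# four concurrency classes, and a transaction picks its conflict-handling
# strategy per accessed item. Assignments below are illustrative only.
from enum import Enum

class CC(Enum):
    O = "optimistic"      # first-committer-wins
    R = "reconciliation"  # first-n-committers-win
    P = "pessimistic"     # first-reader-wins (lock on read)
    E = "escrow"          # first-n-readers-win (bounded quantity)

# Example assignment of data items to classes, driven by access type
# and usage semantics (hypothetical item names).
data_classes = {
    "account.balance": CC.E,   # escrow: decrements bounded by available funds
    "customer.name": CC.O,     # rarely conflicting: optimistic
    "stock.level": CC.R,       # reconcilable counter updates
    "order.state": CC.P,       # must not be read concurrently: lock
}

def strategy_for(item: str) -> CC:
    """Pick the concurrency control mechanism for an accessed data item."""
    return data_classes.get(item, CC.O)  # default to optimistic
```

The run-time adaptation mentioned in the abstract would then amount to rewriting entries of `data_classes` according to the current load or a known usage profile.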
This edited volume examines the success factors for implementing and operating ESN. The first part of the volume comprises academic contributions on current research and presents well-founded scientific concepts for the use of ESN. The contributions in the second part focus on concepts from practice, before the third part considers user experiences in the form of case studies. In the course of the digital transformation, companies increasingly deploy social software to achieve positive effects in the areas of employee satisfaction, knowledge transfer, innovation dynamics, productivity, and leadership acceptance. These Enterprise Social Networks (ESN) change internal communication, enable an unimpeded flow of information, and break down organizational silos. In practice, however, the expectations placed in ESN are often not met. Low usage rates, a lack of integration into business processes, and unclear cause-effect relationships raise the question of whether an investment in ESN pays off. Leading researchers, thought leaders from corporate practice, and renowned experts describe key strategies, concepts, and impulses for the management of Enterprise Social Networks.
Enterprise Social Networks: an introduction to the topic and derivation of relevant research fields
(2016)
The relevance of Enterprise Social Networks (ESN) for everyday work in knowledge organizations is growing. These networks support communication, collaboration, and knowledge management in companies. This contribution provides an introduction to the topic of ESN and outlines application scenarios, potentials, and challenges. It gives an overview of key articles that survey research in the ESN field. Individual research contributions are then analyzed and further research potentials are derived. This leads to eight promising areas for future research: 1) user behavior, 2) effects of ESN use, 3) management, leadership, and governance for ESN, 4) value determination and success measurement, 5) cultural effects, 6) architecture and design of ESN, 7) theories, research designs, and methods, and 8) further challenges relating to ESN. The contribution characterizes these areas and formulates exemplary open questions for future research.
Digital companies need information systems to implement their business processes end-to-end. BPM systems are promising candidates for this because their business process model-driven operation mode makes them highly adaptable. End-to-end processes contain different types of sub-processes that are either procedural, data-driven or business rule-based. Modern BPM systems support modeling notations for all these types of sub-processes. Moreover, end-to-end processes contain shadow processing parts, which consequently must also be supported in a performant way. BPMN seems to be the adequate notation for modeling these parts due to its procedural nature. Further, BPMN provides several elements that enable the modeling of parallel executions, which is very interesting for accelerating the shadow processing parts of a process. The present paper examines the limitations and potentials of BPM systems for a high-performance execution of BPMN models representing shadow processing parts of a business process.
Reality mining refers to an application of data mining that uses sensor data to derive behavioral patterns in the real world. Research in this field started a decade ago, when technology was far behind today's state of the art. This paper discusses which requirements are now posed to applications in the context of reality mining. A survey has shown which sensors are available in state-of-the-art smartphones and usable to gather data for reality mining. As another contribution of this paper, a reality mining application architecture is proposed to facilitate the implementation of such applications. A proof of concept verifies the assumptions made about reality mining and the presented architecture.
Nowadays almost every major company runs a monitoring system and produces log data for analysing its systems. To analyse the log data and to extract experience for future decisions, it is important to transform and synchronize different time series. Several methods for synchronizing multiple time series are provided, leading to a synchronized, uniform time series. This is achieved by using discretisation and approximation methods. Furthermore, discretisation through ticks is demonstrated, and the respective results are illustrated.
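Synchronizing irregular series onto a uniform tick grid, as described above, can be sketched with a simple last-observation-carried-forward discretisation. This is an illustrative example only; the function and variable names are invented and the abstract's actual methods may differ.

```python
# Minimal sketch: discretise two irregularly sampled time series onto a
# shared uniform tick grid by carrying the last observed value forward.
from bisect import bisect_right

def resample(series, ticks):
    """series: sorted list of (timestamp, value) pairs; ticks: uniform grid.
    Returns the value in effect at each tick (None before the first sample)."""
    times = [t for t, _ in series]
    out = []
    for tick in ticks:
        i = bisect_right(times, tick)          # samples at or before the tick
        out.append(series[i - 1][1] if i > 0 else None)
    return out

# Two log-derived series with different, irregular timestamps:
a = [(1, 10), (4, 11), (9, 12)]
b = [(2, 5.0), (6, 5.5)]
ticks = [0, 2, 4, 6, 8, 10]                    # uniform tick grid
sync_a = resample(a, ticks)                    # [None, 10, 11, 11, 11, 12]
sync_b = resample(b, ticks)                    # [None, 5.0, 5.0, 5.5, 5.5, 5.5]
```

After resampling, both series share the same time index and can be compared or combined tick by tick.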
Many organizations have identified the opportunities of big data analytics to support the business with problem-specific insights through the exploitation of generated data. Socio-technical solutions are developed in big data projects to reach competitive advantage. Although these projects are aligned to specific business needs, common architectural challenges are not addressed in a comprehensive manner. Enterprise architecture management is a holistic approach to tackle the complex business and IT architecture. The transformation of an organization's EA is influenced by big data projects and their data-driven approach on all layers. To enable strategy-oriented development of the EA, it is essential to synchronize these projects supported by EA management. In this paper, we conduct a systematic review of big data literature to analyze which requirements for the EA management discipline are proposed. Thereby, a broad overview of existing research is presented to facilitate a more detailed exploration and to foster the evolution of the EA management discipline.
Rapidly growing data volumes push today's analytical systems close to the feasible processing limit. Massive parallelism is one possible solution to reduce the computational time of analytical algorithms. However, data transfer becomes a significant bottleneck since it blocks system resources while moving data to code. Technological advances make it possible to economically place compute units close to storage and to perform data processing operations close to the data, minimizing data transfers and increasing scalability. Hence the principle of Near Data Processing (NDP) and the shift towards code-to-data. In the present paper we claim that the development of NDP system architectures is becoming an inevitable task in the future. Analytical DBMSs like HPE Vertica offer multiple points of impact with major advantages, which are presented within this paper.
The second Digital Enterprise Computing Conference DEC 16 at the Herman Hollerith Center in Böblingen brings together students, researchers, and practitioners to discuss solutions, experiences, and future developments for the digital transformation. Digitization of business and IT defines the conference agenda: technology acceptance, digital transformation, digital business & administration, digital process challenges, analytics, and big data & data processing.
The question of why individuals adopt information technology has been present in information systems research for the past quarter century. One of the most used models for predicting technology usage was introduced by Fred Davis: the Technology Acceptance Model (TAM). It describes the influence of perceived usefulness and perceived ease of use on attitude, behavioral intention and system usage. The first two factors are in turn influenced by external variables. Although a plethora of papers exists about the TAM, an extensive analysis of the role of the external variables in the model is still missing. This paper aims to give an overview of the most important variables. In an extensive literature review, we identified 763 relevant papers, found 552 unique single external variables, characterized the most important of them, and described the frequency of their appearance. Additionally, we grouped these variables into four categories (organizational characteristics, system characteristics, user personal characteristics, and other variables). Afterwards, we discuss the results and show implications for theory and practice.
The Internet of Things (IoT) refers to the interconnectedness of physical objects, which are equipped with sensors and actuators as a means to connect to the internet. The number of connected things has increased threefold over the past five years. Consequently, firms expect the IoT to become a source of new, technology-driven business models. However, only a few early adopters have started to install and use IoT appliances on a frequent basis, so it is still unclear which factors drive the technological acceptance of IoT appliances. Confronting this gap in current research, the present paper explores how IoT appliances are conceptually defined, which factors drive their technological acceptance, and how firms can use the results to improve value propositions in corresponding business models. It is discovered that IoT appliance vendors need to support a broad focus, as potential buyers exhibit great variety. Drawing on this insight, the paper illustrates several flexible marketing strategies.
The acquisition of data for reality mining applications is a critical factor, since many mobile devices, e.g. smartphones, must be capable of capturing the required data; otherwise, only a small target group would be able to use the reality mining application. In the course of a survey, we identified smartphone features that might be relevant for various reality mining applications. The survey classifies these features and shows how the support for each feature has changed over the years by analyzing 143 smartphones released between 2004 and 2015. All analyzed devices can be ranked by the number of features they provide. Furthermore, this paper addresses quality issues that arose while carrying out the survey.
Converting users into customers: the role of user profile information and customer journey analysis
(2016)
Due to the digital transformation, the importance of web analysis and user profiling for enterprises is increasing rapidly, as customers focus on digital channels to obtain information about products and brands. While a lot of research on these topics exists, only a minority of firms use it to their advantage. This study aims to tighten the link between research and business so that experimental methods can be used to improve communication strategies in practice. To this end, a systematic literature analysis is conducted, workshops are observed and documented, and an empirical study is used to integrate the individual steps into a framework for the practical usage of user profiling and customer journey analysis.
The amount of image data has been rising exponentially over the last decades due to numerous trends such as social networks, smartphones, automotive, biology, medicine, and robotics. Traditionally, file systems are used as storage. Although they are easy to use and can handle large data volumes, they are suboptimal for efficient sequential image processing because they organise data only at the level of single images. Database systems, and especially column stores, support more structured storage and access methods at the raw data level for entire image series.
In this paper we propose definitions of various layouts for the efficient storage of raw image data and metadata in a column store. These schemes are designed to improve the runtime behaviour of image processing operations. We present a tool called the column-store Image Processing Toolbox (cIPT), which allows data layouts and operations to be easily combined for different image processing scenarios.
The experimental evaluation of a classification task on a real-world image dataset indicates a performance increase of up to 15x on a column store compared to a traditional row store (PostgreSQL), while space consumption is reduced by 7x. With these results, cIPT provides the basis for a future mature database feature.
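The intuition behind a columnar pixel layout can be sketched as follows. This is an illustrative toy, not cIPT's actual layout definitions: storing each channel as its own contiguous array lets a channel-wise scan touch only the bytes it needs, whereas a row layout must walk every interleaved record.

```python
# Toy sketch of row- vs. column-oriented pixel storage
# (names are illustrative; cIPT's real layouts differ in detail).

pixels = [(10, 20, 30), (40, 50, 60), (70, 80, 90)]  # (R, G, B) tuples

# Row layout: one record per pixel, all channels interleaved.
row_store = list(pixels)

# Column layout: one contiguous array per channel.
col_store = {
    "R": [p[0] for p in pixels],
    "G": [p[1] for p in pixels],
    "B": [p[2] for p in pixels],
}

# A channel-wise operation (e.g. mean of green) reads one third of the
# data in the column layout, while the row layout scans every record.
mean_g_row = sum(p[1] for p in row_store) / len(row_store)
mean_g_col = sum(col_store["G"]) / len(col_store["G"])
assert mean_g_row == mean_g_col  # same answer, different I/O footprint
```

The same effect scales to entire image series: sequential operations over one attribute become sequential scans over one column instead of strided reads across all of them.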
Companies have recently intensified their engagement with social media for internal communication and collaboration. So-called enterprise social networks (ESN) provide integrated platforms with profiles, blogs, shared document management, wikis, chats, and group and commenting functions for company-internal use. Substantial investments are very often involved. The budgets are spent mainly on IT, while "soft factors" are frequently left out. This can lead to considerable problems with the acceptance of such platforms. Further measures are therefore required to steer the introduction and operation of ESN, which can be summarized under the term governance. The governance construct refers to the nature and scope of the roles and tasks for steering the use of ESN. This paper examines possible governance models for the introduction and further development of ESN. The results of this research were obtained on the basis of a thorough literature analysis and an explorative survey of executives responsible for ESN use in large German companies. The implications of the qualitative data analysis point to relationships that can serve as initial hypotheses for further research.
At Bayer AG, IBM Connections is used as the enterprise social network solution. Bayer's goal is to connect employees worldwide, to support communication across unit boundaries, and to provide a pool of knowledge and experts. As part of a relaunch in 2012, Connections@Bayer, previously available only in some subgroups, was rolled out to the entire company. In a further relaunch in 2014, the company performed an update to version 4.5 and ran an extensive communication campaign that created awareness of the platform among employees and aroused curiosity. For this campaign, an analysis of the key benefits of using Connections was carried out, eight core messages were developed, and these were distributed through various communication channels within the company. In addition, the use of testimonials made it possible to present the benefits to all employee groups. This relaunch was successful: user numbers grew and employee satisfaction increased. This case study vividly shows that a relaunch of an enterprise social network accompanied by an effective communication campaign can bring about lasting success.
In this paper we present our work in progress on revisiting traditional DBMS mechanisms for managing space on native Flash and how it is administered by the DBA. Our observations and initial results show that the standard logical database structures can be used for the physical organization of data on native Flash, and that, at the same time, higher DBMS performance is achieved without incurring extra DBA overhead. An initial experimental evaluation indicates a 20% increase in transactional throughput under TPC-C through intelligent data placement on Flash, with fewer erase operations and thus better Flash longevity.
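Why placement affects erase counts can be shown with a small sketch. This is not the paper's actual algorithm; it is a generic illustration, with invented names, of a well-known Flash effect: segregating frequently rewritten ("hot") pages from stable ("cold") pages into separate erase blocks means garbage collection copies less live data before erasing a block.

```python
# Illustrative sketch (not the paper's mechanism): hot/cold separation
# reduces the live pages garbage collection must relocate per erase.

BLOCK_SIZE = 4  # pages per erase block

def place(pages, separate_hot_cold):
    """Assign pages to erase blocks, optionally segregating by heat."""
    if separate_hot_cold:
        ordered = sorted(pages, key=lambda p: p["hot"])  # cold first, hot last
    else:
        ordered = pages                                   # arrival order
    return [ordered[i:i + BLOCK_SIZE] for i in range(0, len(ordered), BLOCK_SIZE)]

def gc_copy_cost(blocks):
    """Live cold pages that must be copied out when reclaiming blocks
    that contain invalidated (hot, rewritten) pages."""
    cost = 0
    for blk in blocks:
        if any(p["hot"] for p in blk):              # block needs erasing
            cost += sum(not p["hot"] for p in blk)  # live pages to relocate
    return cost

pages = [{"id": i, "hot": i % 2 == 0} for i in range(8)]  # alternating heat
mixed = gc_copy_cost(place(pages, separate_hot_cold=False))
split = gc_copy_cost(place(pages, separate_hot_cold=True))
assert split < mixed  # segregated placement copies fewer pages per erase
```

Fewer relocated pages means fewer write-and-erase cycles, which is the longevity effect the abstract reports.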
Based on well-established robotic concepts of autonomous localization and navigation, we present a system prototype, implemented in the Robot Operating System (ROS), that assists camera-based indoor navigation for human users. Our prototype takes advantage of state-of-the-art computer vision and robotic methods and is designed for assistive indoor guidance. We employ a vibro-tactile belt as a guiding device that renders derived motion suggestions to the user via vibration patterns. We evaluated the effectiveness of a variety of vibro-tactile feedback patterns for guiding blindfolded users. Our prototype demonstrates that a vision-based system can support human navigation and may also assist the visually impaired in a human-centered way.
Organizations have identified the opportunities of big data analytics to support the business with problem-specific insights through the exploitation of generated data. Socio-technical solutions are developed in big data projects to achieve competitive advantage. Although these projects are aligned to specific business needs, common architectural challenges are not addressed in a comprehensive manner. Enterprise architecture (EA) management is a holistic approach to tackling complex business and IT architectures. The transformation of an organization's EA is influenced by big data transformation processes and their data-driven approach on all layers. In this paper, we review the big data literature to analyze which requirements for the EA management discipline are proposed. Based on a systematic literature identification, conceptual categories of requirements for EA management are elicited using an inductive category formation. These conceptual categories constitute a category system that facilitates a new perspective on EA management and fosters the innovation-driven evolution of the EA management discipline.
Analysis is an important part of the enterprise architecture management process. Prior to decisions regarding the transformation of the enterprise architecture, the current situation and the outcomes of alternative action plans have to be analysed. Many analysis approaches have been proposed by researchers, and current enterprise architecture management tools implement analysis functionality. However, little work has been done on structuring and classifying enterprise architecture analysis approaches. This paper collects and extends existing classification schemes, presenting a framework for the classification of enterprise architecture analysis. For evaluation, a collection of enterprise architecture analysis approaches has been classified based on this framework. As a result, the description of these approaches has been assessed, a common set of important categories for enterprise architecture analysis classification has been derived, and suggestions for further development are drawn.
Purpose – This paper aims to complement the current understanding of user engagement in electronic word-of-mouth (eWoM) communications across online service and product communities. It examines the effect of senders' prior experience with products and services, and of their extent of acquaintance with other community members, on user engagement with eWoM.
Design/methodology/approach – The study used a sample of 576 unique user postings from the corporate fan page of two German firms: a service community of a telecom provider and a product community of a car manufacturer. Multiple regression analysis is used to test the conceptual model.
Findings – Senders’ prior experience and acquaintance positively affect user engagement with eWoM, and these effects differ across communities for products and services and across their influence on “likes” and “comments”. The results also suggest that communities for products are orientated toward information sharing, while those discussing services engage in information building.
Research limitations/implications – This research explains mechanisms of user engagement with eWoM and opens directions for future research around motives, content, and social media tools within the structures of online communities. The insights into the information-handling dimensions of online tools and the antecedents to their use contribute to research on two topics prioritized by the Marketing Science Institute: "Measuring and Communicating the Value of Online Marketing Activities and Investments" and "Leveraging Digital/Social/Mobile Technology".
Practical implications – This research offers insights for firms to leverage user engagement and facilitate eWoM generation through members who have a higher number of acquaintances or who have more experience with the product or service. Executives should concentrate their community engagement strategies on the identification and utilization of power users. The conceptualization and empirical test about the role of likes and comments will help social media managers to create and better capture value from their social media metrics.
Originality/value – The insights about the underlying factors that influence engagement with eWoM advance our understanding about the usage of online content.
Although still in the early stages of diffusion, smartwatches represent the most popular type of wearable devices. Yet, little is known why some people are more likely to adopt smartwatches than others. To deepen the understanding of underlying factors prompting adoption behavior, the authors develop a theoretical model grounded in technology acceptance and social psychology literature. Empirical results reveal perceived usefulness and visibility as important factors that drive intention. The magnitude of these antecedents is influenced by an individual’s perception of viewing smartwatches as a technology and/or as a fashion accessory. Theoretical and managerial implications are discussed.
This paper provides an introduction to the topic of enterprise social networks (ESN) and illustrates possible applications, potentials, and challenges for future research. It outlines an analysis of research papers containing a literature overview in the field of ESN. Subsequently, individual relevant research papers are analysed and further research potentials are derived from them. This yields seven promising areas for further research: (1) user behaviour; (2) effects of ESN usage; (3) management, leadership, and governance; (4) value assessment and success measurement; (5) cultural effects; (6) architecture and design of ESN; and (7) theories, research designs, and methods. This paper characterises these areas and articulates further research directions.
Industrie 4.0: Outlook
(2016)
For companies, it is important to set the strategic course for their Industrie 4.0 direction early and to build experience in dealing with Industrie 4.0 technologies. However, some of the technologies relevant to Industrie 4.0 are not expected to realize their full efficiency potential for another 5 to 10 years. The introduction of Industrie 4.0 affects almost all areas of a company and must therefore be understood, planned, and actively managed not only as a digital transformation but also as a cultural change in the organization. Topics such as data protection and IT security are not only important prerequisites for a successful introduction of Industrie 4.0; as essential acceptance and success factors, they must also be consistently and comprehensively anchored in the digital systems.
Significant advances have been achieved in mobile robot localization and mapping in dynamic environments; however, these approaches are mostly incapable of dealing with the physical properties of automotive radar sensors. In this paper we present an accurate and robust solution to this problem by introducing a memory-efficient cluster map representation. Our approach is validated by experiments on a public parking space with pedestrians, moving cars, and different parking configurations, providing a challenging dynamic environment. The results prove its ability to reproducibly localize our vehicle within an error margin below 1% with respect to ground truth using only point-based radar targets. A decay process enables our map representation to support local updates.
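How a decay process supports local map updates can be sketched generically. The following is an invented toy, not the paper's actual representation or parameters: each cluster carries a confidence score that decays exponentially over time and is refreshed when the radar re-observes it, so clusters belonging to objects that have moved away fade out and are pruned locally.

```python
import math

# Toy sketch (parameters invented): map clusters with decaying
# confidence; re-observed clusters are refreshed, stale ones pruned.

DECAY_RATE = 0.5   # confidence decay per second
PRUNE_BELOW = 0.1  # clusters below this confidence are dropped

class Cluster:
    def __init__(self):
        self.confidence = 1.0

    def decay(self, dt):
        self.confidence *= math.exp(-DECAY_RATE * dt)

    def reinforce(self):
        self.confidence = 1.0  # fresh observation restores confidence

def update_map(clusters, dt, observed):
    for c in clusters:
        c.decay(dt)            # everything ages
    for c in observed:
        c.reinforce()          # re-seen clusters are refreshed
    return [c for c in clusters if c.confidence >= PRUNE_BELOW]

static, moving = Cluster(), Cluster()
clusters = [static, moving]
for _ in range(10):            # 10 s pass; only `static` is re-observed
    clusters = update_map(clusters, dt=1.0, observed=[static])
assert static in clusters and moving not in clusters
```

The update touches only the clusters in view plus a cheap per-cluster multiply, which is what makes such a scheme compatible with local, incremental map maintenance.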
What does a successful introduction of Industrie 4.0 look like? This book systematically presents the concept, paradigms, and relevant technologies of Industrie 4.0 as well as their overall interrelations. In contrast to the common, purely technological and application-oriented perspective, the book additionally merges strategic, tactical, and operational levels of consideration into one integrative thread. Its centerpiece is a process model that describes the need for action at the strategic and operational levels. A practical case, various Industrie 4.0 use cases, and renowned experts from research and practice make this book interesting for newcomers as well as for middle and upper managers interested in implementation who wish to gain a new perspective on the complexity of the topic. The glossary makes the book a valuable reference work on Industrie 4.0.
For financial reasons, SMEs often do not see themselves in a position to invest in fundamental Industrie 4.0 technologies. A supposedly poor cost-benefit ratio and long pay-back cycles are cited as the main reservations. The current challenges tend to lie in ever-advancing internationalization and in rising innovation pressure from competitors. It is of course known that the increasing interconnection of production facilities in Industrie 4.0 also entails risks for IT and data security. Problems with data quality, stability, interfaces, and legal questions likewise contribute to companies' uncertainty. Given the ever-increasing interconnection between companies and stakeholders, suppliers in particular must see it as their duty to take up the topic of Industrie 4.0 and engage with it. Precisely these companies must realize that only through the future use of suitable information and communication technologies will they still be able to remain part of the value chain between their customers and suppliers.
The Eighth International Conference on Advances in Databases, Knowledge, and Data Applications (DBKDA 2016), held between June 26 - 30, 2016 in Lisbon, Portugal, continued a series of international events covering a large spectrum of topics related to advances in database fundamentals, the evolving relation between databases and other domains, database technologies and content processing, as well as the specifics of application-domain databases. Advances in different technologies and domains related to databases have triggered substantial improvements in content processing, information indexing, and data, process, and knowledge mining. The push came from Web services, artificial intelligence, and agent technologies, as well as from the generalization of XML adoption. High-speed communications and computations, large storage capacities, and load balancing for distributed database access allow new approaches to content processing with incomplete patterns, advanced ranking algorithms, and advanced indexing methods. Developments in e-business, e-health and telemedicine, bioinformatics, finance and marketing, and geographical positioning systems put pressure on database communities to push the 'de facto' methods to support new requirements in terms of scalability, privacy, performance, indexing, and heterogeneity of both content and technology.