Data integration of heterogeneous data sources relies either on periodically transferring large amounts of data to a physical Data Warehouse or on retrieving data from the sources only on request. The latter results in the creation of what is referred to as a virtual Data Warehouse, which is preferable when using the latest data is paramount. However, the downside is that it adds network traffic and suffers from performance degradation when the amount of data is high. In this paper, we propose the use of a readCheck validator to ensure the timeliness of the queried data and to reduce data traffic. We further show that the readCheck allows transactions to update data in the data sources while obeying full Atomicity, Consistency, Isolation, and Durability (ACID) properties.
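As an illustration of the idea (the class and method names below are our own invention, not taken from the paper), a readCheck-style validator can be sketched as a cheap version check that lets the virtual warehouse skip re-fetching data that has not changed:

```python
# Hypothetical sketch of a readCheck-style freshness check for a virtual
# data warehouse: instead of re-fetching full rows on every query, the
# mediator asks each source for a cheap version stamp and only transfers
# data when that stamp has changed.
class Source:
    def __init__(self, rows):
        self._rows = rows
        self._version = 0

    def read_check(self):
        # Cheap call: returns only a version stamp, no payload.
        return self._version

    def fetch(self):
        # Expensive call: returns the full data set.
        return list(self._rows)

    def update(self, rows):
        # A committed write bumps the version stamp.
        self._rows = rows
        self._version += 1


class VirtualWarehouse:
    def __init__(self, source):
        self.source = source
        self._cache = None
        self._cached_version = None

    def query(self):
        v = self.source.read_check()
        if self._cache is None or v != self._cached_version:
            # Only now is the large data transfer performed.
            self._cache = self.source.fetch()
            self._cached_version = v
        return self._cache
```

In this sketch, repeated queries against an unchanged source cost only one version-stamp round trip each, which is the traffic reduction the abstract alludes to.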
The increasing number of connected mobile devices such as fitness trackers and smartphones generates new data for health insurers, enabling them to gain deeper insights into the health of their customers. These additional data sources, together with the trend towards an interconnected health community including doctors, hospitals, and insurers, lead to challenges regarding data filtering, organization, and dissemination. First, we analyze what kind of information is relevant for a digital health insurance. Second, we define functional and non-functional requirements for storing and managing health data in an interconnected environment. Third, we propose a data architecture for a digitized health insurance, consisting of a data model and an application architecture.
Customer services in the digital transformation: social media versus hotline channel performance
(2015)
Due to the digital transformation, online service strategies have gained prominence in practice as well as in the theory of service management. This study examines the efficacy of different types of service channels in customer complaint handling. The theoretical framework, developed using the complaint handling and social media literature, is tested against data collected from two different channels (hotline and social media) of a German telecommunication service provider. We contribute to the understanding of firms’ multichannel distribution strategies in two ways: a) by conceptualizing and evaluating complaint handling quality across traditional and social media channels, and b) by testing the impact of complaint handling quality on key performance outcomes like customer loyalty, positive word-of-mouth, and cross-purchase intentions.
This paper addresses the following four research questions: 1. How should customer service quality in social media channels be conceptualized on multiple levels? 2. Which aspects of customer service quality are important in enhancing customer satisfaction? 3. What outcomes are affected by customer service quality and customer satisfaction? 4. How effective are customer services delivered through social media channels compared to customer services delivered through other channels?
At Reutlingen University in Germany, students from different countries and disciplines can learn business English within the framework of a theatre production. In the "Business English Theatre" they work in an international project team staging a play with a business focus, thereby improving their language, social, and professional skills.
This work reports on practical experiences with the issue of interoperability in German practice management systems (PMSs), gained in an ongoing clinical trial on teledermatology, the TeleDerm project. A proprietary, established web platform for store-and-forward telemedicine is integrated with the IT in the GPs’ offices for the automatic exchange of basic patient data. Most of the 19 different PMSs included in the study sample lack support for modern health data exchange standards; therefore, the relatively old but widely available German health data exchange interface “Gerätedatentransfer” (GDT) is used. Due to the lack of enforcement and regulation of the GDT standard, several obstacles to interoperability are encountered. As a partial but reusable working solution to cope with these issues, we present a custom middleware that is used in conjunction with GDT. We describe the design, the technical implementation, and the hindrances observed with the existing infrastructure, and discuss health care interfacing standards and the current state of interoperability in German PMS software.
The diversity of energy prosumer types makes it difficult to create appropriate incentive mechanisms that satisfy both prosumers and energy system operators alike. Meanwhile, European energy suppliers buy guarantees of origin (GoO) which allow them to sell green energy at premium prices while in reality delivering grey energy to their customers. Blockchain technology has proven itself to be a robust payment system in which users transact money without the involvement of a third party. Blockchain tokens can be used to represent a unit of energy and, just like GoOs, be submitted to the market. This paper focuses on simulating a marketplace using the Ethereum blockchain and smart contracts, where prosumers can sell tokenized GoOs to consumers willing to subsidize renewable energy producers. Such markets bypass energy providers by allowing consumers to obtain tokenized GoOs directly from the producers, who in turn benefit directly from the earnings. Two market strategies in which tokens are sold as GoOs have been simulated. In the Fix Price Strategy, prosumers sell their tokens at the average GoO price of 2014. The Variable Price Strategy focuses on selling tokens in a price range defined by the difference between grey and green energy prices. The study finds that the Ethereum blockchain is robust enough to function as a platform for tokenized GoO trading. The simulation results have been compared and indicate that prosumers earn significantly more money by following the Variable Price Strategy.
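The two pricing strategies can be sketched in a few lines. All prices below are illustrative placeholders rather than the figures from the simulation, and the midpoint rule in `variable_price_revenue` is our own simplification of "a price range defined by the difference between grey and green energy":

```python
# Illustrative comparison of the two simulated market strategies.
# All prices are placeholder values, not data from the study.

def fix_price_revenue(tokens_sold, avg_goo_price_2014):
    # Fix Price Strategy: every token sells at the 2014 average GoO price.
    return tokens_sold * avg_goo_price_2014


def variable_price_revenue(tokens_sold, grey_price, green_price):
    # Variable Price Strategy: tokens sell within the grey/green price
    # spread; here we assume the midpoint of that range for simplicity.
    spread = green_price - grey_price
    return tokens_sold * (grey_price + spread / 2)
```

With a grey price below and a green price above the fixed 2014 average, the variable strategy yields the higher revenue, matching the direction of the study's finding.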
Context: Nowadays, the market environment is characterized by high uncertainty due to high market dynamics, confronting companies with new challenges in creating and updating product roadmaps. Most companies still use traditional approaches, which typically fail in such environments. Therefore, companies are seeking new product roadmapping approaches.
Objective: This paper presents good practices to help companies better understand which factors are required for successful product roadmapping in a dynamic and uncertain market environment.
Method: Based on a grey literature review, essential aspects of conducting product roadmapping in a dynamic and uncertain market environment were identified. Expert workshops were then held with two researchers and three practitioners to develop good practices and the proposed approach for an outcome-driven roadmap. These results were then presented to another set of practitioners, whose perceptions were gathered through interviews.
Results: The study results in nine good practices that give practitioners insights into which aspects are crucial for product roadmapping in a dynamic and uncertain market environment. Moreover, we propose an approach to product roadmapping that provides a flexible structure and focuses on delivering value to the customer and the business. To ensure the latter, the approach consists of the main items outcome hypothesis, validated outcomes, and discovered outputs.
In recent years, the parallel computing community has shown increasing interest in leveraging cloud resources for executing parallel applications. Clouds exhibit several fundamental features of economic value, such as on-demand resource provisioning and a pay-per-use model. Additionally, several cloud providers offer their resources with significant discounts, albeit with limited availability. Such volatile resources are an auspicious opportunity to reduce the costs arising from computations and thus achieve higher cost efficiency. In this paper, we propose a cost model for quantifying the monetary costs of executing parallel applications in cloud environments that leverage volatile resources. Using this cost model, one is able to determine a configuration of a cloud-based parallel system that minimizes the total costs of executing an application.
Condition monitoring for mechanical systems like bearings or transmissions is often done by analysing frequency spectra obtained from accelerometers mounted on the components under observation. Although this approach yields a high amount of information about the system behaviour, interpreting the resulting spectra requires expert knowledge, that is, a deep understanding of the effect of condition deterioration on the measured spectra. However, an increasing number of condition monitoring applications demands other representations of the measured signals that can be easily interpreted even by non-experts. Therefore, the objective of this paper is to develop an approach for processing measured process data in order to obtain an easy-to-interpret measure for assessing the component condition. The main idea is to evaluate the deterioration of a component's condition by computing the correlation function of current measurements with past measurements, detecting deterioration from a change in these correlation functions. Besides the simplicity of the obtained measure, this approach also opens the opportunity to integrate a model-based approach. The developed method is tested on a condition monitoring application in a roller chain.
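A minimal version of such a correlation-based measure might look like the following. This is a sketch of the general idea only, not the paper's exact formulation: it reduces the comparison of a current measurement against a past (reference) measurement to a single normalized correlation value.

```python
def condition_similarity(reference, current):
    # Normalized correlation between a reference (healthy) measurement and
    # a current one. Values near 1 mean the component condition is largely
    # unchanged; a drop over time signals deterioration.
    # Simplified sketch, not the paper's exact measure.
    n = len(reference)
    mean_r = sum(reference) / n
    mean_c = sum(current) / n
    cov = sum((r - mean_r) * (c - mean_c)
              for r, c in zip(reference, current))
    norm_r = sum((r - mean_r) ** 2 for r in reference) ** 0.5
    norm_c = sum((c - mean_c) ** 2 for c in current) ** 0.5
    return cov / (norm_r * norm_c)
```

A non-expert can then track this single number instead of interpreting full frequency spectra, which is the motivation stated in the abstract.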
This paper investigates the possibility of effectively monitoring and controlling the respiratory action using a very simple and non-invasive technique based on a single lightweight, reduced-size wireless surface electromyography (sEMG) sensor placed below the sternum. Due to the critical sensor position, the captured sEMG signal is characterized by a low energy level and is affected by motion artifacts and cardiac noise. In this work we present a preliminary study performed on adults for assessing the correlation between the spirometry signal and the sEMG signal after removal of the superimposed heart signal. This study and the related findings could be useful in the respiratory monitoring of preterm infants.
The presented research is dedicated to estimating the correlation between the level of renewable energy sources and the costs of congestion management in the electric networks of selected European countries. Data from six countries in the North-West European area (Italy, Spain, Germany, France, Poland, and Austria) were investigated. The factors considered included grid congestion costs (both re-dispatching and countertrading costs), gross electricity generation, the installed capacity of electric generating facilities, the installed capacity of non-dispatchable renewable energy sources, and total electricity consumption. Special attention is paid to the share of renewable energy sources. It is found that the grid congestion costs are not clearly affected by the penetration of non-dispatchable renewables in all the analysed countries, and therefore a clear mathematical correlation cannot be extrapolated from the available data. The results of this research show, in general, a loose dependency of the grid congestion costs on the penetration of renewables and a strong dependency on the total electricity consumption of the country.
The automotive industry faces three major challenges: the shortage of fossil fuels, the politics of global warming, and rising competition from new markets. In order to remain competitive, companies have to develop more efficient and alternative-fuel vehicles that meet the individual requirements of customers. Functional integration combined with new technologies and materials is the key to lasting success in this industry. The sustained upward trend towards system innovations within the last ten years confirms this. The development of complex products like automobiles demands skills from various disciplines, e.g. engineering and chemistry. Furthermore, these skills are spread all over the supply chain. Hence, the only way to stay successful in the automotive industry is cooperation and collaborative innovation. Interdisciplinary and interorganizational development places high demands on cooperation models, especially in the automotive industry. In this case study, cooperation models are analyzed and evaluated according to their applicability to interdisciplinary, interorganizational development projects in the automotive industry. Subsequently, the research campus ARENA2036 is analyzed. ARENA2036 is an interdisciplinary, interorganizational development project housing automobile manufacturers, suppliers, research establishments, and university institutes. Finally, based on interviews with the partners and the preceding analyses of cooperation models, suggestions for implementation are given to ARENA2036.
Radiofrequency ablation is an ablation technique to treat tumors with focused heat. Computed tomography, ultrasound, and magnetic resonance imaging (MRI) are imaging modalities which can be used for image-guided procedures. MRI offers several advantages in comparison to the other modalities, such as radiation-free fluoroscopic imaging, temperature mapping, a high soft-tissue contrast, and free selection of imaging planes. This work addresses the application of 3D controllers for controlling interventional, fluoroscopic MR sequences in the scenario of MR-guided radiofrequency ablation of hepatic malignancies. During this procedure, the interventionalist can monitor the targeting of the tumor with near-real-time fluoroscopic sequences. In general, adjustments of the imaging planes are necessary during tumor targeting, which are performed by an assistant in the control room. Therefore, communication between the interventionalist in the scanner room and the assistant in the control room is essential. However, verbal communication is impaired by the loud scanning noises. Alternatively, non-verbal communication between the two persons is possible, but it is limited to a few gestures and susceptible to misunderstandings. This work analyzes different 3D controllers to enable the interventionalist to control interventional MR sequences directly during MR-guided procedures. The Leap Motion, Wii Remote, SpaceNavigator, Phantom Omni, and a foot switch were selected. A simulation was built in C++ with VTK to mimic the real scenario for test purposes. Previous results showed that the Leap Motion is not suitable for the application, while the Wii Remote and the foot switch are possible input devices. The final evaluation showed a general time reduction with the use of 3D controllers. The best results were achieved with the Wii Remote at 34 seconds. Handheld input devices like the Wii Remote have further potential to be integrated into the real environment to reduce intervention time.
In networked operating room environments, there is an emerging trend towards standardized, non-proprietary communication protocols which allow building new integration solutions and flexible human-machine interaction concepts. The most prominent endeavor is the IEEE 11073 SDC protocol. For some use cases, it would be helpful if not just medical devices but also building automation systems such as lighting, shutters, and air conditioning could be controlled based on SDC. For those systems, the KNX protocol is widely used. We built an SDC-to-KNX gateway which allows using the SDC protocol to send commands to connected KNX devices. The first prototype was successfully implemented in the demonstration operating room at Reutlingen University. This is a first step towards the integration of a broader variety of KNX devices.
Rapid value delivery requires a company to use empirical evaluation of new features and products in order to avoid unnecessary product risks. This helps to make data-driven decisions and to ensure that development is focused on features that provide real value for customers. Short feedback loops are a prerequisite, as they allow for fast learning and reduced reaction times. Continuous experimentation is a development practice in which the entire R&D process is guided by constantly conducting experiments and collecting feedback. Although the principles of continuous experimentation have been successfully applied in domains such as game software or SaaS, it is not obvious how to transfer continuous experimentation to the business-to-business (B2B) domain. In this article, a case study from a medium-sized software company in the B2B domain is presented. The study objective is to analyze the challenges, benefits, and organizational aspects of continuous experimentation in the B2B domain. The results suggest that technical challenges are only one part of the challenges a company encounters in this transition. The company also has to address challenges related to its customers and to the organizational culture. Unique properties of each customer's business play a major role and need to be considered when designing experiments. Additionally, the speed at which experiments can be conducted is relative to the speed at which production deployments can be made. Finally, the article shows how the study results can be used to modify development in the case company so that more feedback and data are used instead of opinions.
Due to frequently changing requirements, the internal structure of cloud services is highly dynamic. To ensure flexibility, adaptability, and maintainability for dynamically evolving services, modular software development has become the dominating paradigm. Following this approach, services can be rapidly constructed by composing existing, newly developed, and publicly available third-party modules. However, newly added modules might be unstable, resource-intensive, or untrustworthy. Thus, satisfying non-functional requirements such as reliability, efficiency, and security while ensuring rapid release cycles is a challenging task. In this paper, we discuss how to tackle these issues by employing container virtualization to isolate modules from each other according to a specification of isolation constraints. We satisfy non-functional requirements for cloud services by automatically transforming the comprised modules into a container-based system. To deal with the increased overhead caused by isolating modules from each other, we calculate the minimum set of containers required to satisfy the specified isolation constraints. Moreover, we present and report on a prototypical transformation pipeline that automatically transforms cloud services developed with the Java Platform Module System into container-based systems.
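Assigning modules to containers under pairwise isolation constraints can be viewed as a graph coloring problem: modules are vertices, "must be isolated" pairs are edges, and containers are colors. The greedy sketch below (our own illustration, not the paper's algorithm) produces a valid assignment, though not necessarily the true minimum that the paper computes:

```python
# Sketch: module-to-container assignment as greedy graph coloring.
# modules: list of module names; isolation_pairs: pairs that must not
# share a container. Greedy coloring is valid but not always minimal.
def assign_containers(modules, isolation_pairs):
    neighbors = {m: set() for m in modules}
    for a, b in isolation_pairs:
        neighbors[a].add(b)
        neighbors[b].add(a)
    container = {}
    for m in modules:
        # Pick the lowest container index not used by an isolated neighbor.
        used = {container[n] for n in neighbors[m] if n in container}
        c = 0
        while c in used:
            c += 1
        container[m] = c
    return container
```

For example, if a third-party module must be isolated from both a web and a database module, the latter two can still share one container, so two containers suffice instead of three.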
Besides the optimisation of the car itself, energy efficiency and safety can also be increased by optimising the driving behaviour. Based on this fact, a driving system is under development whose goal is to educate the driver in energy-efficient and safe driving. It monitors the driver, the car, and the environment and gives recommendations relevant to energy efficiency and safety. However, the driving system tries not to distract or bother the driver, for example by withholding recommendations during stressful driving situations or when the driver is not interested in them. Therefore, the driving system monitors the stress level of the driver as well as the driver's reaction to a given recommendation and decides whether or not to give a recommendation. This makes it possible to suppress recommendations when needed and thus to increase road safety and the user acceptance of the driving system.
Fitting 3D Morphable Face Models (3DMM) to a 2D face image allows the separation of face shape from skin texture, as well as correction for facial expression. However, the recovered 3D face representation is not readily amenable to processing by convolutional neural networks (CNNs). We propose a conformal mapping from a 3D mesh to a 2D image, which makes these machine learning tools accessible to 3D face data. Experiments with a CNN-based face recognition system designed using the proposed representation have been carried out to validate the advocated approach. The results obtained on standard benchmarking data sets show its promise.
This paper describes a new method for condition monitoring of a roller chain. In contrast to conventional methods, no additional accelerometers are used to measure and interpret frequency spectra; instead, the chain condition is evaluated using an easy-to-interpret similarity measure based on correlation functions of the driving motor torque. An additional clustering of current data and reference measurements yields an easy-to-understand representation of the chain condition.
Platforms and their surrounding ecosystems are becoming increasingly important components of many companies' strategies. Artificial Intelligence, in particular, has created new opportunities to create and develop ecosystems around the platform. However, there is not yet a methodology to systematically develop these new opportunities for enterprise development strategy. Therefore, this paper aims to lay a foundation for the conceptualization of Artificial Intelligence-based service ecosystems exploiting a Service-Dominant Logic. The basis for conceptualization is the study of value creation and particularly effective network effects. This research investigates the fundamental idea of extending specific digital concepts considering the influence of Artificial Intelligence on the design of intelligent services, along with their architecture of digital platforms and ecosystems, to enable a smooth evolutionary path and adaptability for human-centric collaborative systems and services. The paper explores an extended digital enterprise conceptual model through a combined, iterative, and permanent task of co-creating value between humans and intelligent systems as part of a new idea of cognitively adapted intelligent services.
Decentralisation and increasing energy efficiency are success factors of the 'Energiewende'. Sensible interlinking of various energy markets will support and speed up the energy system transformation process. This concept study examines and discusses an innovative approach to integrating the power, heat, and mobility markets using hybrid vehicles. Automobile electrification is steadily rising and goes hand in hand with qualitative (larger energy storage options) and quantitative (many more hybrid vehicles) growth in storage capacity. Further utilisation options of the electrical storage units in e-vehicles, as intermediate storage to compensate for volatile renewable energy sources, are being discussed and tested. The innovative approach of integrating future full-hybrid vehicles with the principle of 'combined heat and power' to supply energy to buildings is not being pursued in depth, or even at all. In this approach, both the electrical and the thermal energy produced would be used as supply sources for the building.
The blockchain technology represents a decentralized database that stores information securely in immutable data blocks. Regarding supply chain management, these characteristics offer potentials in increasing supply chain transparency, visibility, automation, and efficiency. In this context, first token-based mapping approaches exist to transfer certain manufacturing processes to the blockchain, such as the creation or assembly of parts as well as their transfer of ownership. However, the decentralized and immutable structure of blockchain technology also creates challenges when applying these token-based approaches to dynamic manufacturing processes. As a first step, this paper investigates existing mapping approaches and exemplifies weaknesses regarding their suitability for products with changeable configurations. Secondly, a concept is proposed to overcome these weaknesses by introducing logically coupled tokens embedded into a flexible smart contract structure. Finally, a concept for a token-based architecture is introduced to map manufacturing processes of products with changeable configurations.
In rural areas, public transport causes high costs per passenger and kilometre, as the frequency of scheduled buses is low; therefore, many people avoid using public transport. With the trend of moving from urban regions to the countryside, individual traffic will further increase. To tackle the issues of emissions and of mobility for young and elderly people, and to provide economically meaningful public transport, a new concept was elaborated in Germany. It consists of (partly) autonomous, remote-controlled shuttle buses. For the implementation, rural districts of Germany have worked together and set up a three-phase plan consisting of a publicly funded project, a highly frequented pilot region, and industrial partners with the commitment and means for the necessary investments. The concept promises economic value with respect to installation, service, and maintenance costs; it lowers the barriers to public transport for young and elderly people and ultimately reduces emissions and congestion.
Competing logics in evaluating employee performance : building compromises through conventions
(2015)
Current research argues that competing institutional logics can co-exist enduringly and investigates how organizations cope with such institutional complexity (Greenwood et al. 2011). Thereby, the role of practices in handling competing logics has been overlooked, and it is currently understood only to a limited extent how organizations establish compromises between competing logics. Therefore, we investigated the recent performance appraisal reform of a German public sector organization that occurred in 2008 (see also Kozica, Brandl 2015). BAND (the pseudonym for our organization) has been using performance appraisals for several decades, and performance appraisals have already become entrenched instruments (Zeitz, Mittal, McAulay 1999) for handling staff promotion decisions. While BAND accepted the accountability logic of the performance appraisal, the professional logic (which in our organization is based on trust and comradeship as a high value of being professional) was accepted too, and BAND had established a fine-grained compromise between the different logics. During the recent reform of the performance appraisal system, however, this compromise broke up and challenged organizational members to (re-)arrange a compromise. Using the French convention school of thinking (Boltanski, Thévenot 2006), we address how BAND copes with conflicting logics by forming compromises in organizational practices. Thereby, we show that the concept of convention is particularly promising for understanding how organizations deal with institutional complexity. More broadly, our argument contributes to the elaboration of an organizational theory for the institutional logics discussion that explains how organizational and individual actions are interlinked.
Shorter product life cycles and emerging technologies are changing the circumstances under which the design of assembly and logistics systems has to be carried out. Engineers are in charge of adapting production to the underlying product at a higher pace, overseeing a more complex system, and finding the ideal solution for a functional work system design as well as for the social interactions between humans and machines in cyber-physical systems. Such collaborative work systems consider the individual capabilities and potentials of humans and machines in order to combine them in a manner that assists the operator during the daily work routine. To be able to design such work systems, specific competences such as the ability of integrated process and product planning as well as systems and interface competence are required. Learning factories train students as well as professionals to gain such qualifications by providing a close-to-reality learning environment based on a didactical concept which covers all relevant methods for ergonomic work system design and a state-of-the-art infrastructure. Group-based, activity-oriented scenarios enable the participants to transfer what they have learned into their everyday work life. Thereby, learning factories have an indirect impact on the transfer of proven best practices to industry.
The metric and qualitative analysis of models of the upper and lower dental arches is an important aspect of orthodontic treatment planning. Currently available eLearning systems for dental education only allow access to digital learning materials and do not interactively support the learning progress. Moreover, to date no study has compared the efficiency of learning methods based on physical versus digital study models. For this pilot study, 18 dental students were separated into two groups to investigate whether the learning success in study model analysis with an interactive eLearning system is higher with digital models or with conventional plaster models. The results show that with the digital method less time is needed per model analysis. Moreover, the digital approach leads to higher total scores than the one based on plaster models. We conclude that interactive eLearning using digital dental arch models is a promising tool for dental education.
The integration of renewable energy sources in single-family homes is challenging. Advance knowledge of the demand for electrical energy, heat, and domestic hot water (DHW) is useful for scheduling projectable devices like heat pumps. In this work, we consider demand time series for heat and DHW from 2018 for a single-family home in Germany. We compare different forecasting methods to predict such demands for the next day. While the 1-day-back forecast method performed best for predicting heat demand, the N-day average performed best for DHW demand when the Unbiased Exponentially Moving Average (UEMA) is used with a memory of 2.5 days. This is surprising, as these forecasting methods are very simple and do not leverage additional information sources such as weather forecasts.
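The two baseline forecasters named above can be sketched in a few lines. This is a generic illustration under our own naming; the exact UEMA weighting with a 2.5-day memory used in the study is not reproduced here, only the plain N-day average:

```python
# Minimal versions of two simple day-ahead demand forecasters.
# history: list of daily demand values, oldest first.

def one_day_back(history):
    # 1-day-back: tomorrow's demand is predicted to equal today's.
    return history[-1]


def n_day_average(history, n):
    # N-day average: tomorrow's demand is predicted as the mean of the
    # last n observed days.
    window = history[-n:]
    return sum(window) / len(window)
```

In practice one would compute such forecasts over the whole year of demand data and compare them with an error metric such as the mean absolute error to reproduce the kind of comparison the study performs.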
With the Internet of Things (IoT) being one of the most discussed trends in the computing world lately, many organizations find themselves struggling with the great paradigm shift and thus with implementing IoT on a strategic level. The Ignite methodology, part of the Enterprise-IoT project, promises to support organizations with these strategic issues, as it combines best practices with expert knowledge from diverse industries, helping to create a better understanding of how to transform into an IoT-driven business. A framework introduced within the context of IoT business model development is the Bosch IoT Business Model Builder. In this study, this framework is compared to the Osterwalder Business Model Canvas and the St. Gallen Business Model Navigator, the most commonly used and referenced frameworks according to a quantitative literature analysis.
Simple MOSFET models intended for hand analysis are inaccurate in deep sub-micrometer process technologies and in the moderate inversion region of device operation. Accurate models, such as the Berkeley BSIM6 model, are too complex for use in hand analysis and are intended for circuit simulators. Artificial neural networks (ANNs) are efficient at capturing both linear and non-linear multivariate relationships. In this work, a straightforward modeling technique is presented using ANNs to replace the BSIM model equations. Existing open-source libraries are used to quickly build models with error rates generally below 3%. When combined with a novel approach, such as the gm/Id systematic design method, the presented models are sufficiently accurate for use in the initial sizing of analog circuit components without simulation.
Detecting semantic similarities between sentences is still a challenge today due to the ambiguity of natural languages. In this work, we propose a simple approach to identifying semantically similar questions by combining the strengths of word embeddings and Convolutional Neural Networks (CNNs). In addition, we demonstrate how the cosine similarity metric can be used to effectively compare feature vectors. Our network is trained on the Quora dataset, which contains over 400k question pairs. We experiment with different embedding approaches such as Word2Vec, Fasttext, and Doc2Vec and investigate the effects these approaches have on model performance. Our model achieves competitive results on the Quora dataset and complements the well-established evidence that CNNs can be utilized for paraphrase recognition tasks.
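The cosine comparison of feature vectors mentioned above reduces to a single formula; a minimal pure-Python version (the vector contents here are made up):

```python
import math

def cosine_similarity(u, v):
    """Cosine of the angle between two feature vectors:
    1.0 = same direction (high similarity), 0.0 = orthogonal."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Identical feature vectors are maximally similar; orthogonal ones score 0.
assert cosine_similarity([1.0, 0.0], [1.0, 0.0]) == 1.0
assert cosine_similarity([1.0, 0.0], [0.0, 1.0]) == 0.0
```

In the described setup, the vectors being compared would be the CNN's feature representations of the two questions, not raw word embeddings.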
The potentials and opportunities created by digitized healthcare can be further customized through smart data processing and analysis using accurate patient information. This development and the associated new treatment concepts based on digital smart sensors can lead to an increase in motivation by applying gamification approaches. This effect can also be used in the field of medical treatment, e.g. with the help of a digital spirometer combined with an app. In one of our exemplary applications, we show how to control an airplane within an app by breathing, i.e. inhaling and exhaling. Using this biofeedback within a game allows us to increase the motivation and fun for children who need to perform necessary exercises.
The relevance of Robotic Process Automation (RPA) has increased over the last few years. Combining RPA with Artificial Intelligence (AI) can further enhance the business value of the technology. The aim of this research was to analyze applications, terminology, benefits, and challenges of combining the two technologies. A total of 60 articles were analyzed in a systematic literature review to evaluate the aforementioned areas. The results show that by adding AI, RPA applications can be used in more complex contexts, it is possible to minimize the human factor during the development process, and AI-based decision-making can be integrated into RPA routines. This paper also presents a current overview of the used terminology. Moreover, it shows that by integrating AI, some unseen challenges in RPA projects can emerge, but also a lot of new benefits come along with it. Based on the outcome, it is concluded that the topic offers a lot of potential, but further research and development is required. The results of this study help researchers to gain an overview of the state of the art in combining RPA and AI.
Combining agile development and software product lines in automotive: challenges and recommendations
(2018)
Software product lines (SPLs) are used throughout the automotive industry. SPLs help to manage the large number of variants and to improve quality by reuse. In order to develop high quality software faster, agile software development (ASD) practices are introduced. From both the research and the management point of view it is still not clear how these two approaches can be combined. We derive recommendations to combine ASD and SPLs based on challenges identified for an automotive specific model. This study combines the outcome of a literature review and a qualitative interview study with 16 practitioners from the automotive domain. We evaluate the results and analyze the relationship between ASD and SPLs in the automotive domain. Furthermore, we derive recommendations to combine ASD and SPLs based on challenges identified in the automotive domain. This study identifies 86 individual challenges. Important challenges address supplier collaboration and faster software release cycles without loss of quality. The identified challenges and the derived recommendations show that the combination of ASD and SPL in the automotive industry is promising but not trivial. There is a need for an automotive-specific approach that combines ASD and SPL.
While many maintainability metrics have been explicitly designed for service-based systems, tool-supported approaches to automatically collect these metrics are lacking. Especially in the context of microservices, decentralization and technological heterogeneity may pose challenges for static analysis. We therefore propose the modular and extensible RAMA approach (RESTful API Metric Analyzer) to calculate such metrics from machine-readable interface descriptions of RESTful services. We also provide prototypical tool support, the RAMA CLI, which currently parses the formats OpenAPI, RAML, and WADL and calculates 10 structural service-based metrics proposed in scientific literature. To make RAMA measurement results more actionable, we additionally designed a repeatable benchmark for quartile-based threshold ranges (green, yellow, orange, red). In an exemplary run, we derived thresholds for all RAMA CLI metrics from the interface descriptions of 1,737 publicly available RESTful APIs. Researchers and practitioners can use RAMA to evaluate the maintainability of RESTful services or to support the empirical evaluation of new service interface metrics.
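To give a flavor of what a structural interface metric looks like, the sketch below counts HTTP operations in a parsed OpenAPI document. This is a hypothetical stand-in for illustration only; the ten metrics the RAMA CLI actually computes are the ones proposed in the cited literature.

```python
HTTP_METHODS = {"get", "put", "post", "delete", "patch", "head", "options"}

def count_operations(openapi_doc):
    """Number of operations (path/method pairs) in a parsed OpenAPI dict.
    A large count hints at a broad, potentially hard-to-maintain interface."""
    return sum(
        1
        for path_item in openapi_doc.get("paths", {}).values()
        for method in path_item
        if method in HTTP_METHODS
    )

# Toy spec: two paths, three operations in total.
spec = {"paths": {"/orders": {"get": {}, "post": {}}, "/orders/{id}": {"get": {}}}}
assert count_operations(spec) == 3
```

A benchmark like the one described would then place such raw values into quartile-based threshold bands (green to red) derived from a large API corpus.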
The digitization of our society changes the way we live, work, learn, communicate, and collaborate. This disruptive change interacts with all information processes and systems, which have been important business enablers in the context of digitization for years. Our aim is to support flexibility and agile transformations for both business domains and related information technology and enterprise systems through adaptation and evolution of digital enterprise architectures. The present research paper investigates collaborative decision mechanisms for adaptive digital enterprise architectures by extending original architecture reference models with state-of-the-art elements for agile architectural engineering for digitization and collaborative architectural decision support.
In this paper we present the results of a workshop held at CitSci2022 on the topic of co-creation in citizen science (CS) for the development of climate adaptation measures: which success factors promote, and which barriers hinder, a fruitful collaboration and co-creation process between scientists and volunteers, under consideration of social, motivational, technical/technological, and legal factors? We grounded the mentioned factors in the scientific literature. Our findings suggest that a clear communication strategy for the project goals, and for how citizen scientists can contribute, is important. In addition, citizen scientists have to feel included and see that their contribution makes a difference. To achieve this, it is critical to present the results to the citizen scientists. The relationship between scientists and citizen scientists is also essential for keeping the citizen scientists engaged. Notification of meetings and events needs to be made well in advance, and events should be scheduled in the attendees' leisure time. The citizen scientists should be supported especially in technical questions. As a result, they feel appreciated and remain part of the project. Among the legal factors, the current General Data Protection Regulation was considered important by the participants of the workshop. In further research we will address the individual points and, first of all, improve our communication with the citizen scientists about the project goals and how they can contribute. In addition, we should better share the achieved results.
We present a new method for detecting gait disorders according to their stage using cluster methods for sensor data. 21 healthy and 18 Parkinson subjects performed the timed up and go test. The time series were segmented into separate steps. For the analysis, the horizontal acceleration measured by a mobile sensor system was considered. We used dynamic time warping and hierarchical clustering to distinguish the stages. A specificity of 92% was achieved.
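Dynamic time warping, which the clustering above builds on, compares two step segments that may differ in speed; a minimal textbook implementation (the example values are invented, not the study's sensor data):

```python
def dtw_distance(a, b):
    """Classic dynamic-programming DTW distance between two 1-D sequences,
    using the absolute difference as the local cost."""
    n, m = len(a), len(b)
    inf = float("inf")
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            d[i][j] = cost + min(d[i - 1][j],      # insertion
                                 d[i][j - 1],      # deletion
                                 d[i - 1][j - 1])  # match
    return d[n][m]

# A time-stretched copy of a step still matches perfectly (distance 0).
assert dtw_distance([0.0, 1.0, 0.0], [0.0, 1.0, 1.0, 0.0]) == 0.0
```

The pairwise DTW distances between step segments would then feed the hierarchical clustering as its distance matrix.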
So-called cloud-based management information systems are a fairly recent phenomenon in management accounting. Quite a few companies (and especially their business managers and management accountants) do not always work via the cloud, but with hybrid or on-premise solutions of ERP software such as SAP or Oracle, and often still with "manual" solutions such as Microsoft Excel.
To analyze human sleep it is necessary to identify the sleep stages occurring during sleep, their durations, and the sleep cycles. The gold standard procedure for this is polysomnography (PSG), which classifies the sleep stages based on the Rechtschaffen and Kales (R-K) method. Aside from advantages such as high accuracy, this method has some disadvantages, among them a time-consuming procedure that is uncomfortable for the patient. Therefore, the development of further methods for sleep classification in addition to PSG is a promising topic for investigation, and this work aims to present possible ways and goals for this development.
Indoor localization systems are becoming more and more important with the digitalization of the industrial sector. Sensor data such as the current position of machines, transport vehicles, goods, or tools represent an essential component of cyber-physical production systems (CPPS). However, due to the high costs of these sensors, they are not widespread and are used mainly in special scenarios. Optical indoor positioning systems (OIPS) based on cameras, in particular, have certain advantages due to their technological specifications. In this paper, the application scenarios and requirements as well as their characteristics are presented, and a classification approach for OIPS is introduced.
Classification model of supply chain events regarding their transferability to blockchain technology
(2021)
The blockchain technology represents a decentralized database that stores information securely in immutable data blocks. Regarding supply chain management, these characteristics offer potentials in increasing supply chain transparency, visibility, automation, and efficiency. In this context, first token-based mapping approaches exist to transfer certain supply chain events to the blockchain, such as the creation or assembly of parts as well as their transfer of ownership. However, the decentralized and immutable structure of blockchain technology also creates challenges. In particular, the scalability, storage capacity, and the special requirements for storage formats make it currently impossible to map all supply chain events unrestrictedly on the blockchain. As a first step, this paper identifies important supply chain events for different use cases combining blockchain technology and supply chain management. Secondly, the supply chain events are classified in terms of their expected technical properties and their relevance for the respective use case. Finally, the identified supply chain events are evaluated regarding their transferability to blockchain technology and a classification model is introduced.
New business concepts such as Enterprise 2.0 foster the use of social software in enterprises. Especially social production significantly increases the amount of data in the context of business processes. Unfortunately, these data are still an unearthed treasure in many enterprises. Due to advances in data processing such as Big Data, the exploitation of context data becomes feasible. To provide a foundation for the methodical exploitation of context data, this paper introduces a classification, based on two classes, intrinsic and extrinsic data.
Because of saturated markets and of the low profit margins in the sales of cars, car manufacturers focus more and more on profitable product related services. This paper deals with the question of how to classify product related services in the automotive industry and which characteristic product related services are offered to the end-users (consumers) in a standardized format. Two research studies on the product related services provided in 2010 and 2017 by 15 car manufacturers and 20 exemplary automotive brands in Germany revealed that the application degree by the OEMs (original equipment manufacturers) increased considerably in these years. While in 2010 the average range of services only amounted to 33%, the value in the automotive industry increased to 57% by 2017.
Class Phi2 amplifier using GaN HEMTs at 13.56MHz with tuned transformer for wireless power transfer
(2022)
This paper discusses a design procedure for a wireless power transfer system at an RF switching frequency of 13.56 MHz. The wireless power transfer amplifier uses GaN HEMTs in a Class Phi2 topology and is designed to achieve high efficiency and high power density. A design method for the load over a certain bandwidth is presented for a transformer with its tuning network.
The amount of image data has been rising exponentially over the last decades due to numerous trends like social networks, smartphones, automotive, biology, medicine, and robotics. Traditionally, file systems are used as storage. Although they are easy to use and can handle large data volumes, they are suboptimal for efficient sequential image processing because data organisation is limited to single images. Database systems, and especially column stores, support more structured storage and access methods at the raw data level for entire image series.
In this paper we propose definitions of various layouts for the efficient storage of raw image data and metadata in a column store. These schemes are designed to improve the runtime behaviour of image processing operations. We present a tool called the column-store Image Processing Toolbox (cIPT), which allows the data layouts and operations to be easily combined for different image processing scenarios.
The experimental evaluation of a classification task on a real-world image dataset indicates a performance increase of up to 15x on a column store compared to a traditional row store (PostgreSQL), while the space consumption is reduced by 7x. With these results, cIPT provides the basis for a future mature database feature.
In today’s education, healthcare, and manufacturing sectors, organizations and information societies are discussing new enhancements to corporate structure and process efficiency using digital platforms. These enhancements can be achieved using digital tools. Industry 5.0 and Society 5.0 give several potentials for businesses to enhance the adaptability and efficacy of their industrial processes, paving the way for developing new business models facilitated by digital platforms. Society 5.0 can contribute to a super-intelligent society that includes the healthcare industry. In the past decade, the Internet of Things, Big Data Analytics, Neural Networks, Deep Learning, and Artificial Intelligence (AI) have revolutionized our approach to various job sectors, from manufacturing and finance to consumer products. AI is developing quickly and efficiently. The latest artificial intelligence chatbot, ChatGPT, created by OpenAI, has taken the internet by storm. We tested the effectiveness of the large language model ChatGPT on four critical questions concerning “Society 5.0”, “Healthcare 5.0”, “Industry,” and “Future Education” from the perspective of the 5.0 age.
Advanced power semiconductors such as DMOS transistors are key components of modern power electronic systems. Recent discrete and integrated DMOS technologies have very low area-specific on-state resistances so that devices with small sizes can be chosen. However, their power dissipation can sometimes be large, for example in fault conditions, causing the device temperature to rise significantly. This can lead to excessive temperatures, reduced lifetime, and possibly even thermal runaway and subsequent destruction. Therefore, it is required to ensure already in the design phase that the temperature always remains in an acceptable range. This paper will show how self-heating in DMOS transistors can be experimentally determined with high accuracy. Further, it will be discussed how numerical electrothermal simulations can be carried out efficiently, allowing the accurate assessment of self-heating within a few minutes. The presented approach has been successfully verified experimentally for device temperatures exceeding 500 °C up to the onset of thermal runaway.
Organizations identified the opportunities of big data analytics to support the business with problem-specific insights through the exploitation of generated data. Sociotechnical solutions are developed in big data projects to reach competitive advantage. Although these projects are aligned to specific business needs, common architectural challenges are not addressed in a comprehensive manner. Enterprise architecture (EA) management is a holistic approach to tackle complex business and IT architectures. The transformation of an organization’s EA is influenced by big data transformation processes and their data-driven approach on all layers. In this paper, we review big data literature to analyze which requirements for the EA management discipline are proposed. Based on a systematic literature identification, conceptual categories of requirements for EA management are elicited utilizing an inductive category formation. These conceptual categories of requirements constitute a category system that facilitates a new perspective on EA management and fosters the innovation-driven evolution of the EA management discipline.
An ongoing challenge in our days is to lower the impact on the quality of life caused by dysfunctionality through individual support. With the background of an aging society and continuous increases in costs for care, a holistic solution is needed. This solution must integrate individual needs and preferences, locally available possibilities, regional conditions, professional and informal caregivers and provide the flexibility to implement future requirements. The proposed model is a result of a common initiative to overcome the major obstacles and to center a solution on individual needs caused by dysfunctionality.
The increasing slew rate of modern power switches can increase the efficiency and reduce the size of power electronic applications. This requires a fast and robust signal transmission to the gate driver of the high-side switch. This work proposes a galvanically isolated capacitive signal transmission circuit to increase common mode transient immunity (CMTI). An additional signal path is introduced to significantly improve the transmission robustness for small duty cycles to assure a safe turn-off of the power switch. To limit the input voltage range at the comparator on the secondary side during fast high-side transitions, a clamping structure is implemented. A comparison between a conventional and the proposed signal transmission is performed using transistor level simulations. A propagation delay of about 2 ns over a wide range of voltage transients of up to 300V/ns at input voltages up to 600V is achieved.
In practice, the use of layout PCells for analog IC design has not advanced beyond primitive devices and simple modules. This paper introduces a Constraint-Administered PCell-Applying Blocklevel Layout Engine (CAPABLE) which permits PCells to access their context, thus enabling a true "bottom-up" development of complex parameterized modules. These modules are integrated into the design flow with design constraints and applied by an execution cockpit via an automatically built layout script. The practical purpose of CAPABLE is to easily generate full-custom block layouts for given schematic circuits. Looking ahead, our results inspire a whole new conception of PCells that can not only act (on demand), but also react (to environmental changes) and interact (with each other).
Power loss measurement of power electronic components and overall systems is sometimes difficult using electrical quantities and, in a few applications, not even possible. Calorimetric power loss measurement is an established method to identify the overall system losses with suitable accuracy. This paper presents a novel method with an open-chamber calorimeter under accurate air mass flow, air pressure, and humidity measurement and temperature control. The benefits are an approximately halved measurement time compared to established systems and the possibility to control the chamber temperature. This makes it possible to measure the power losses at different ambient temperatures.
Even though near-data processing (NDP) can provably reduce data transfers and increase performance, current NDP is solely utilized in read-only settings. Slow or tedious to implement synchronization and invalidation mechanisms between host and smart storage make NDP support for data-intensive update operations difficult. In this paper, we introduce a low-latency cache-coherent shared lock table for update NDP settings in disaggregated memory environments. It utilizes the novel CCIX interconnect technology and is integrated in neoDBMS, a near-data processing DBMS for smart storage. Our evaluation indicates end-to-end lock latencies of ∼80-100ns and robust performance under contention.
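The core idea of a shared lock table can be sketched in plain Python. This is purely illustrative: neoDBMS's table lives in CCIX-coherent shared memory between host and smart storage, not behind a software mutex, and the interface below (shared "S" vs. exclusive "X" modes, retry on conflict) is our simplifying assumption.

```python
import threading

class LockTable:
    """Toy shared lock table: readers ("S") share a record lock,
    writers ("X") need exclusive access. Conflicts return False
    so the caller can retry, instead of blocking."""

    def __init__(self):
        self._mutex = threading.Lock()
        self._table = {}  # record id -> (mode, holder count)

    def acquire(self, rid, mode):
        with self._mutex:
            entry = self._table.get(rid)
            if entry is None:
                self._table[rid] = (mode, 1)
                return True
            held_mode, count = entry
            if mode == "S" and held_mode == "S":
                self._table[rid] = ("S", count + 1)  # another shared reader
                return True
            return False  # conflicting request: caller must retry

    def release(self, rid):
        with self._mutex:
            mode, count = self._table[rid]
            if count == 1:
                del self._table[rid]
            else:
                self._table[rid] = (mode, count - 1)

# Usage: two readers share, a writer is rejected until both release.
t = LockTable()
assert t.acquire("rec1", "S") and t.acquire("rec1", "S")
assert not t.acquire("rec1", "X")
t.release("rec1"); t.release("rec1")
assert t.acquire("rec1", "X")
```

In the hardware setting described above, the same conflict check happens on cache-coherent memory shared across the interconnect, which is what makes the ~80-100 ns latencies possible.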
Continuous monitoring of individual vital parameters can provide information for the assessment of one’s health and indications of medical problems in the context of personalized medicine. Correlations between parameters and health issues are to be evaluated. As one project in this topic area, a telemedicine platform is implemented to gather data from outpatients via wearables and accumulate them for physicians and researchers to review. This work extracts requirements, draws use case scenarios, and shows the current system architecture consisting of a patient application, a physician application with a web server, and a backend server application. In further work, the prototype will assist in developing a vendor-free and open monitoring solution. Functionality and usability will be evaluated in an imminent first study.
Non-fungible tokens (NFTs) are unique digital assets that have recently gained significant popularity, particularly in the digital art sector. The success of NFTs and other blockchain-based innovations depends on their acceptance and use by consumers. This study aims to understand the impact of moral values on the acceptance of NFTs. Based on a quantitative survey with over 800 complete responses, the analysis shows that moral aspects of NFTs are indeed important for potential users. However, there is an attitude-behavior gap, as the positive impact of moral values on the intention to use NFTs is not reflected in the respondents' actual current usage of NFTs. This study contributes to knowledge by providing new empirical data on the acceptance of NFTs and highlighting the role of moral values in the acceptance decision.
In 2013, Royal Philips was two years into a daunting transformation. Following declining financial performance, CEO Frans van Houten aimed to turn the Dutch icon into a "high-performing Company" by 2017. This case study examines the challenges of the business-driven IT transformation at Royal Philips, a diversified technology company. The case discusses three crucial issues. First, the case reflects on Philips’ aim at creating value from combining locally relevant products and services while also leveraging its global scale and scope. Rewarded and unrewarded business complexity is analyzed. Second, the case identifies the need to design and align multiple elements of an enterprise (organizational, cultural, technical) to balance local responsiveness with global scale. Third, the case explains the role of IT (as an asset instead of a liability) in Philips’ transformation and discusses the new IT landscape with its digital platforms, and the new practices to create effective business-IT partnerships.
The energy sector in Germany, as in many other countries, is undergoing a major transformation. To achieve the climate targets, numerous measures to implement smart energy and resource efficiency are necessary. Therefore, energy companies are experiencing increasing pressure from politics and society to transform their business areas in a sustainable manner and implement smart and sustainable business models. Consequently, numerous resources are expected to flow into the development and implementation of new business models. But often these efforts remain unsuccessful in practice. There is a large amount of literature on barriers and drivers of smart and sustainable business models in the energy sector. But what are the factors that companies struggle with most when developing and implementing new business models in practice? To answer this question, the results of a systematic literature review were evaluated by conducting semi-structured interviews with experts of the German energy sector. Six categories of transformation barriers were identified: Organizational, Financial, Legal, Partner-Network, Societal and Technological barriers. To overcome these barriers, recommendations for action and key success factors are outlined by the experts interviewed. The interview study validates key barriers and drivers in terms of their significance in practice in the German energy sector and makes recommendations to advance the smart and sustainable transformation of the energy sector.
In a time of upheaval and digitalization, new business models for companies play an important role. Decentralized power generation and energy efficiency indicators to achieve climate goals and to reduce global warming are currently forcing energy companies to develop new business models. In recent years, many methods of business model development have been introduced to create new business ideas. But what are the obstacles in implementing these business models in the energy sector to develop new business opportunities? And what challenges do companies face in this respect? To answer this question, a systematic literature review was conducted in this paper. As a result, eight categories were identified which summarise the main barriers for the implementation of new business models in the energy domain.
The energy turnaround, digitalization, and decreasing revenues force enterprises in the energy domain to develop new business models. Following a Design Science Research approach, we showed in two action research projects that business models in the energy domain result in complex ecosystems with multiple actors. Additionally, we identified that municipal utilities have problems with the systematic development of business models. In order to solve this problem, in a second phase we captured the requirements together with the enterprise partners. Furthermore, we developed a method which consists of the following components: a method for the creative development of a new business model in the form of a Business Model Canvas (BMC); a mapping between the e3Value ontology and the BMC for modelling a business ecosystem; and the Business Model Configurator (BMConfig), a prototype for modelling and simulating the e3Value ontology. The business model can be quantified and analyzed for its viability. We demonstrate the feasibility of our approach with the business model of a power community.
This paper investigates the impact of dynamic capabilities (DC) on brand love. From a resource-based view, there is little clarity vis-à-vis the specific capabilities that drive the ability to create brand love. This paper focuses on three research questions: Firstly, which dynamic capabilities are relevant for brand love? Secondly, how strong is the impact of certain dynamic capabilities on brand love? Thirdly, which conditions mediate and moderate the impact of specific dynamic capabilities on brand love? Data from a multi-method research approach have been used to identify the specific capabilities that corporations need to enhance brand love. Furthermore, a standardized online survey was conducted among marketing executives and evaluated by structural equation modeling. The results indicate that customer expertise plays a major role in the relationship between dynamic capabilities and brand love. Furthermore, this relationship is more important in markets that have a low competitive differentiation in products and services.
As "the most international company on earth", DHL Express promised to deliver packages between almost any pair of countries within a defined time-frame. To fulfill this promise, the company had introduced a set of global business and technology standards. While standardization had many advantages (improving service for multinational customers, faster response to changes in import/export regulations, sharing of best practices etc.), it created impediments to local innovation and responsiveness in DHL Express' network of 220 countries/territories. Reconciling standardization-innovation tradeoffs is a critical management issue for global companies in the digital economy.
This case describes one large, successful company's approach to the tradeoff of standardization versus innovation.
Enterprise Architecture (EA) management is an activity that seeks to foster the alignment of business and IT, and pursues various goals further operationalizing this alignment. Key to effective EA management is a framework that defines the roles, activities, and viewpoints used for EA management in accordance with the concerns that the stakeholders aim to address. Consensus holds that such frameworks are organization-specific and hence they are designed in governance activities for EA management. As of today, top-down approaches for governance are used to derive organization-specific frameworks. These usually lack systematic mechanisms for improving the framework based on the feedback of the responsible stakeholders. We outline a bottom-up approach to EA management governance that systematically observes the behavior of the actors to learn user concerns and recommend appropriate viewpoints. With this approach, we complement traditional top-down governance activities.
Bootstrap circuits are mainly used for supplying a gate driver circuit to provide the gate overdrive voltage for a high-side NMOS transistor. The required charge has to be provided by a bootstrap capacitor which is often too large for integration if an acceptable voltage dip at the capacitor has to be guaranteed. Three options of an area efficient bootstrap circuit for a high side driver with an output stage of two NMOS transistors are proposed. The key idea is that the main bootstrap capacitor is supported by a second bootstrap capacitor, which is charged to a higher voltage and connected when the gate driver turns on. A high voltage swing at the second capacitor leads to a high charge allocation. Both bootstrap capacitors require up to 70% less area compared to a conventional bootstrap circuit. This enables compact power management systems with fewer discrete components and smaller die size. A calculation guideline for optimum bootstrap capacitor sizing is given. The circuit was manufactured in a 180nm high-voltage BiCMOS technology as part of a high-voltage gate driver. Measurements confirm the benefit of high-voltage charge storing. The fully integrated bootstrap circuit including two stacked 75.8pF and 18.9pF capacitors results in a voltage dip lower than 1V. This matches well with the theory of the calculation guideline.
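The sizing guideline rests on simple charge balance. The two relations below are textbook approximations sketched under our own assumptions; the numeric values are invented and the paper's actual guideline accounts for the stacked-capacitor circuit details.

```python
def min_bootstrap_cap(gate_charge, max_dip):
    """Smallest bootstrap capacitance that can supply the gate charge with
    at most the allowed voltage dip: C = Q / dV."""
    return gate_charge / max_dip

def boosted_cap_charge(c2, v_precharge, v_rail):
    """Extra charge a second capacitor, precharged above the supply rail,
    injects when it is connected and settles to the rail:
    dQ = C2 * (V2 - Vrail). A larger voltage swing means more charge
    per unit of capacitor area, which is the key idea in the abstract."""
    return c2 * (v_precharge - v_rail)

# Example: 100 nC of gate charge with a 1 V dip budget needs 100 nF.
assert min_bootstrap_cap(100e-9, 1.0) == 100e-9
```

Because the second capacitor trades voltage swing for capacitance, the same delivered charge fits into a smaller total capacitor area, matching the reported ~70% area saving in spirit.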
The main challenge when driving heat pumps with PV electricity is balancing differing electrical and thermal demands. In this article, a heuristic method for the optimal operation of a heat pump driven by a maximum share of PV electricity is presented. For this purpose, the thermal storages are activated in order to shift the operation of the heat pump to times of PV generation. The system under consideration refers to the thermal and electrical demands of a single family house. It consists of a heat pump, a thermal energy storage (TES) for domestic hot water (DHW), and a grid connection. For heating and the generation of DHW, the heat pump runs with two different supply temperatures, thereby achieving a maximum overall COP. Within the algorithm for optimization, a set of heuristic rules is developed in such a way that the operational characteristics of the heat pump in terms of minimum running and stopping times are met, as well as the limiting constraints of upper and lower limits of room temperature and storage energy content. Depending on the amount of electricity generated, a varying number of heat pump schedules fulfilling the boundary conditions are created. Finally, the schedule offering the maximum on-site utilization of PV electricity with a minimum number of heat pump starts, which serves as a secondary condition, is selected. Yearly simulations of this combination have been carried out. Initial results of this method indicate a significant rise in on-site consumption of the PV electricity and in heating demand fulfilment by renewable electricity, with no need for a massive TES for the heating system in the form of a big water tank.
Size and cost of a boost converter can be minimized by reducing the voltage overshoot and speeding up the transient response to load transients. The presented technique improves the transient response of a current-mode controlled boost converter, which usually suffers from bandwidth limitation because of its right-half-plane zero (RHPZ). The proposed technique comprises a load current estimation, which works as part of a digital controller without any additional measurements. Based on the latest load estimate, the controller parameters are adapted, achieving small voltage overshoot and fast transient response. The presented technique was implemented in a digital control circuit consisting of an ASIC in a 110 nm technology, a Xilinx Spartan-6 field-programmable gate array (FPGA), and a TI ADS8422 analog-to-digital converter (ADC). Simulation and measurements of a 4 V-to-6.3 V, 500 mA boost converter show an improvement of 50% in voltage overshoot and in the response time to load transients.
This paper contributes to the automatic detection of perioperative workflow by developing a binary endoscope localization. Automated situation recognition in the context of an intelligent operating room requires the automatic conversion of low-level cues into more abstract high-level information. Imagery from a laparoscope delivers rich content that is easy to obtain but hard to process. We introduce a system which detects whether the endoscope's distal tip is inside or outside the patient based on the endoscope video. This information can be used as one parameter in a situation recognition pipeline. Our localization performs in real time at a video resolution of 1280x720, and 5-fold cross-validation yields mean F1-scores of up to 0.94 on videos of 7 laparoscopies.
Job advertisements are an important means of communicating role expectations for management accountants to the labor market. They provide information about which roles of management accountants are sought by companies and which roles are expected. However, which roles are communicated in job advertisements has so far been unknown. Using a large sample of 889 job ads and a text-mining approach, we show an apparent mix of different role types with a strong focus on a rather classic role: the watchdog role. However, individuals with business-partner characteristics are more often sought for leadership positions or in family businesses and small and medium-sized enterprises (SMEs). The results challenge the current role discussion of management accountants as business partners in practice and in some academic fields.
Excellence in IT is a key enabler for the digital transformation of enterprises. To realize the vision of digital enterprises, it is necessary to cope with changing business requirements and to align business and IT. In order to evaluate the contribution of enterprise architecture management (EAM) to these goals, our paper explores the impact of various factors on the perceived benefit of EAM in enterprises. Based on the literature, we build an empirical research model. It is tested with empirical data from European EAM experts using a structural equation modelling approach. It is shown that changing business requirements, business-IT alignment, the complexity of the information technology infrastructure, as well as the enterprise architecture knowledge of information technology employees are crucial factors influencing the perceived benefit of EAM in enterprises.
It is assumed that more education leads to a better understanding of complex systems. Some researchers, however, find indications that simple mechanisms like stocks and flows are not well understood even by people who have completed higher education. In this paper, we test people’s understanding of complex systems with the widely studied stock-and-flow (SF) tasks (Booth Sweeney and Sterman 2000). SF tasks assess people’s understanding of the interplay between stocks and flows. We investigate the SF failure of domain experts and novices in different knowledge domains. In particular, we compare performance on the original study’s Bathtub task with the square wave pattern (Booth Sweeney and Sterman 2000) with two alternative cover stories from the engineering and business domains, across different groups of business and engineering students from different semesters. Further, we show that, while engineering students perform better than business students, with progress through higher education, students seem to lose the capability of dealing with simple SF tasks from domains other than their own field. We thus find hints of a déformation professionnelle in higher education.
SF failure, the inability of people to correctly determine the behavior of simple stock-and-flow structures, has been the subject of a long research stream. SF failure can be attributed to different causes, one of them being a lack of domain-specific experience, and thus of familiarity with the problem context. In this article, we present the continuation of an experiment examining the role of educational background in SF performance. We base the question set on the Bathtub Dynamics tasks introduced by Booth Sweeney and Sterman (2000) and vary the cover stories. In this paper, we describe how we developed and tested a new cover story for the engineering domain and implemented the recommendations from a prior study. We test three sets of questions with engineering students, which enables us to compare the results to a previous study in which we tested the questions with business students. The results mainly support our hypothesis that context familiarity increases SF performance. With our findings, we further develop the methodology of research on SF failure.
Prior studies have ascribed people’s poor performance in dealing with basic systems concepts to different causes. While results indicate that, among other things, domain-specific experience and familiarity with the problem context play a role in this stock-flow (SF) performance, this has not yet been fully clarified. In this article, we present an experiment that examines the role of educational background in SF performance. We hypothesize that SF performance increases when the problem context is embedded in the problem solver’s knowledge domain, indicated by educational background. Using the square wave pattern and the sawtooth pattern tasks from the initial study by Booth Sweeney and Sterman (2000), we design two additional cover stories for the former, the Vehicle story from the engineering domain and the Application story from the business domain, next to the original Bathtub story. We then test the three sets of questions on business students. The results mainly support our hypothesis. Interestingly, participants even do better on a more complex behavioral pattern from their knowledge domain than on a simpler pattern from more distant domains. Although these findings have to be confirmed by further studies, they contribute both to the methodology of future surveys and to the context familiarity discussion.
Virtual Reality (VR) technology has the potential to support knowledge communication in several sectors. Still, when educators make use of immersive VR technology to present their knowledge, their audience within the same room may no longer be able to see them because of the head-mounted displays (HMDs) they are wearing. In this paper, we propose the Avatar2Avatar system and design, which augments the visual aspect of such a knowledge presentation. Avatar2Avatar enables users to see both a realistic representation of their respective counterpart and the virtual environment at the same time. We point out several design aspects of such a system and address design challenges and possibilities that arose during implementation. We specifically explore opportunities of a system design for integrating 2D video avatars in existing room-scale VR setups. An additional user study indicates a positive impact on spatial presence when using Avatar2Avatar.
The EU-funded project RobLog recently developed a system able to autonomously unload coffee sacks from a standard container. Being the first of its kind, further development is needed for the system to be competitive against manual labor. Financing this development entails a risk, and hence a justified skepticism, which can be overcome by a long-sighted view of the existing market potential. This paper presents a method to estimate the market potential of autonomous unloading systems for heavy deformable goods. Starting from an analysis of the coffee trade, the current coffee traffic is first investigated in order to calculate the number of autonomous systems needed to handle the imported sacks. The results are validated, and the method is extended to calculate the potential of other market segments where the same unloading technology can be applied.
According to several surveys and statistics, the great majority of companies previously not accustomed to automation are piloting solutions to automate business processes. Those accustomed to automation attempt to introduce more of it, focusing on automation-unfriendly processes that have remained manual. However, while the decision on what and whether to automate is not trivial for evident reasons, even industry leaders may get stuck on an overwhelming question: where to begin automating? The question too often remains unanswered, as state-of-the-art methods fail to consider the whole picture. This paper introduces a holistic approach to decision-making for investments in automation. The method supports the iterative analysis and evaluation of operative processes, providing tools for a quantitative approach to decision-making. Thanks to the method, a large pool of processes can first be considered and then filtered in order to select the one that yields the best value for automation in the specific context. After introducing the method, a case study is reported for validation, followed by a discussion.
The high system flexibility necessary for the full automation of complex and unstructured tasks leads to increased technological complexity, and thus to higher costs and lower performance. In this paper, after an introduction to the different dimensions of flexibility, a method for the flexible modular configuration and evaluation of systems of systems is introduced. The method starts from process requirements and, considering factors such as feasibility, development costs, market potential and the effective impact on current processes, enables the evaluation of a flexible system of systems equipped with the needed functionalities before its actual development. This allows setting the focus on those aspects of flexibility that add market value to the system, thus promoting the efficient development of systems addressed to interested customers in intralogistics. An example application of the method is given and discussed.
Physical analog IC design has not been automated to the same degree as digital IC design. This shortfall is primarily rooted in the analog IC design problem itself, which is considerably more complex even for small problem sizes. Significant progress has been made in analog automation in several R&D target areas in recent years. Constraint engineering and generator-based module approaches are among the innovations that have emerged. Our paper will first present a brief review of the state of the art of analog layout automation. We will then introduce active and open research areas and present two visions – a “continuous layout design flow” and a “bottom-up meets top-down design flow” – which could significantly push analog design automation towards its goal of analog synthesis.
In a time of digital transformation, the ability to quickly and efficiently adapt software systems to changed business requirements becomes more important than ever. Measuring the maintainability of software is therefore crucial for the long-term management of such products. With service-based systems (SBSs) being a very important form of enterprise software, we present a holistic overview of maintainability metrics specifically designed for this type of system, since traditional metrics – e.g. object-oriented ones – are not fully applicable in this case. The metric candidates selected from the literature review were mapped to four dominant design properties: size, complexity, coupling, and cohesion. Microservice-based systems (μSBSs) have emerged as an agile and fine-grained variant of SBSs. While the majority of the identified metrics are also applicable to this specialization (with some limitations), the large number of services in combination with technological heterogeneity and decentralization of control significantly impacts automatic metric collection in such systems. Our research therefore suggests that specialized tool support is required to guarantee the practical applicability of the presented metrics to μSBSs.
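As a toy illustration of how such structural metrics can be collected automatically from a service dependency graph, consider the sketch below; the service names and the simple size and coupling measures are generic illustrations, not the specific metric definitions surveyed in the paper:

```python
# Hedged sketch: generic size and coupling measures over a hypothetical
# service dependency graph of a small service-based system.
services = {
    "orders":    {"operations": 7, "calls": ["billing", "inventory"]},
    "billing":   {"operations": 4, "calls": ["orders"]},
    "inventory": {"operations": 5, "calls": []},
}

def coupling(name):
    # efferent (outgoing) plus afferent (incoming) dependencies
    out = len(services[name]["calls"])
    inc = sum(name in s["calls"] for s in services.values())
    return out + inc

for name in services:
    # size here is simply the operation count of the service interface
    print(name, services[name]["operations"], coupling(name))
```

Collecting even such simple measures across hundreds of heterogeneous services is what motivates the tool support argued for above.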
DMOS transistors often suffer from substantial self-heating during high power dissipation, which can lead to thermal destruction if the device temperature reaches excessive values. A successfully demonstrated method to reduce the peak temperature is the redistribution of the power dissipation density from the hotter to the cooler device areas through careful layout modification. However, this is very tedious and time-consuming when complex-shaped devices, as often found in industrial applications, are considered.
This paper presents an approach for fully automatic layout optimization which requires only a few hours of processing time. The approach is applied to complex-shaped test structures which are investigated by measurements and electro-thermal simulations. Results show a significantly lower peak temperature and an energy capability gain of 84 %, offering potential for an 18 % size reduction of the active area.
Switched reluctance motors are particularly attractive due to their simple structure. Controlling this machine type requires the instants at which to switch the currents in the motor phases in an appropriate sequence. These switching instants are determined either from a position sensor or from signals generated by a sensorless method. A very simple sensorless method uses the switching frequency of the hysteresis controllers used for phase current control. This paper presents, first, an automatic commissioning method for this sensorless approach and, second, a startup procedure, thus advancing the approach towards industrial application.
Automatic classification of rotating machinery defects using Machine Learning (ML) algorithms
(2020)
Electric machines and motors have been the subject of enormous development. New concepts in design and control allow expanding their applications in different fields. Vast amounts of data have been collected in almost any domain of interest. They can be static, that is, they represent real-world processes at a fixed point in time. Vibration analysis and vibration monitoring, including detecting and monitoring anomalies in vibration data, are widely used techniques for predictive maintenance in high-speed rotating machines. However, accurately identifying the presence of a bearing fault can be challenging in practice, especially when the failure is still at its incipient stage and the signal-to-noise ratio of the monitored signal is small. The main objective of this work is to design a system that analyzes the vibration signals of a rotating machine in the time/frequency domain, based on recorded sensor data. Given this substantial interest, there has been a dramatic increase in applying Machine Learning (ML) algorithms to this task. An ML system will be used to classify and detect abnormal behavior and to recognize the different levels of machine operation modes. The proposed solution can be deployed as predictive maintenance for Industry 4.0.
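The basic idea of detecting a bearing fault from spectral content can be sketched on synthetic signals. This is a hedged toy illustration, not the paper's pipeline; the sampling rate, tone frequencies, amplitudes and threshold are all invented:

```python
# Hedged sketch: separate "healthy" from "faulty" synthetic vibration
# signals via the power in a single DFT bin at an assumed defect frequency.
import math
import random

FS = 1000          # sampling rate in Hz (assumption)
DEFECT_HZ = 120    # assumed bearing-defect tone

def make_signal(faulty, n=1000, seed=0):
    rng = random.Random(seed)
    sig = []
    for i in range(n):
        t = i / FS
        x = math.sin(2 * math.pi * 50 * t)                 # shaft rotation tone
        if faulty:
            x += 0.8 * math.sin(2 * math.pi * DEFECT_HZ * t)
        sig.append(x + 0.1 * rng.gauss(0, 1))               # measurement noise
    return sig

def tone_power(signal, freq):
    # power in a single DFT bin: correlate with sine and cosine at freq
    n = len(signal)
    s = sum(x * math.sin(2 * math.pi * freq * i / FS) for i, x in enumerate(signal))
    c = sum(x * math.cos(2 * math.pi * freq * i / FS) for i, x in enumerate(signal))
    return (s * s + c * c) / n

def classify(signal, threshold=50.0):
    # faulty (1) if the defect tone carries significant power, else healthy (0)
    return 1 if tone_power(signal, DEFECT_HZ) > threshold else 0

print(classify(make_signal(True, seed=1)))   # 1
print(classify(make_signal(False, seed=1)))  # 0
```

A real system would replace the fixed threshold with a trained classifier over many such spectral features, but the feature-then-decide structure is the same.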
The state of the art proposes the microservices architectural style to build applications. Additionally, container virtualization and container management systems have evolved into the perfect fit for developing, deploying, and operating microservices in line with the DevOps paradigm. Container virtualization facilitates deployment by ensuring independence from the runtime environment. However, microservices store their configuration in the environment. Therefore, software developers have to wire their microservice implementation to technologies provided by the target runtime environment, such as configuration stores and service registries. These technological dependencies counteract the portability benefit of using container virtualization. In this paper, we present AUTOGENIC, a model-based approach to assist software developers in building microservices as self-configuring containers without being bound to operational technologies. We provide developers with a simple configuration model to specify configuration operations of containers and automatically generate a self-configuring microservice tailored to the targeted runtime environment. Our approach is supported by a method which describes the steps to automate the generation of self-configuring microservices. Additionally, we present and evaluate a prototype which leverages the emerging TOSCA standard.
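To make the idea concrete, here is a minimal sketch of generating container self-configuration from a model. The model fields, service name, and rendered shell wiring are invented for illustration and are not AUTOGENIC's actual TOSCA-based model:

```python
# Hedged sketch: render a container entrypoint script from a tiny,
# hypothetical configuration model, so the container wires itself to
# its environment instead of hard-coding runtime technologies.
config_model = {
    "service": "checkout",
    "config": [
        {"name": "DB_URL",   "default": "sqlite://local"},
        {"name": "REGISTRY", "default": "http://localhost:8500"},
    ],
}

def render_entrypoint(model):
    lines = ["#!/bin/sh"]
    for item in model["config"]:
        # each setting falls back to a default if the environment is silent
        lines.append('export {0}="${{{0}:-{1}}}"'.format(item["name"], item["default"]))
    lines.append("exec ./{}".format(model["service"]))
    return "\n".join(lines)

print(render_entrypoint(config_model))
```

The generated script is what would be baked into the container image, keeping the microservice implementation itself free of environment-specific wiring.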
While Microservices promise several beneficial characteristics for sustainable long-term software evolution, little empirical research covers what concrete activities industry applies for the evolvability assurance of Microservices and how technical debt is handled in such systems. Since insights into the current state of practice are very important for researchers, we performed a qualitative interview study to explore applied evolvability assurance processes, the usage of tools, metrics, and patterns, as well as participants’ reflections on the topic. In 17 semi-structured interviews, we discussed 14 different Microservice-based systems with software professionals from 10 companies and how the sustainable evolution of these systems was ensured. Interview transcripts were analyzed with a detailed coding system and the constant comparison method.
We found that especially systems for external customers relied on central governance for the assurance. Participants saw guidelines like architectural principles as important to ensure a base consistency for evolvability. Interviewees also valued manual activities like code review, even though automation and tool support were described as very important. Source code quality was the primary target for the usage of tools and metrics. Despite most reported issues being related to Architectural Technical Debt (ATD), our participants did not apply any architectural or service-oriented tools and metrics. While participants generally saw their Microservices as evolvable, service cutting and finding an appropriate service granularity with low coupling and high cohesion were reported as challenging. Future Microservices research in the areas of evolution and technical debt should take these findings and industry sentiments into account.
Physicians in interventional radiology are exposed to high physical stress. To avoid negative long-term effects resulting from unergonomic working conditions, we demonstrated the feasibility of a system, based on the Azure Kinect camera, that gives feedback about unergonomic situations arising during the intervention. The overall feasibility of the approach could be shown.
The promise of EVs is twofold: first, rejuvenating a transport sector that still heavily depends on fossil fuels, and second, integrating intermittent renewable energies into the power mix. However, it is still not clear how electricity networks will cope with the predicted increase in EVs and their charging demand, especially in combination with conventional energy demand. This paper proposes a methodology that makes it possible to predict the impact of EV charging behavior on the electricity grid. The model simulates the driving and charging behavior of heterogeneous EV drivers, who differ in their mobility patterns, decision-making heuristics and charging strategies. The simulations show that uncoordinated charging results in a clustering of the charging load. In contrast, decentralized coordination makes it possible to fill the valleys of the conventional load curve and to integrate EVs without a costly expansion of the electricity grid.
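The valley-filling effect of coordinated charging can be shown on a toy hourly load profile. This sketch is illustrative only; the paper's agent-based simulation of heterogeneous drivers is far richer, and all numbers below are invented:

```python
# Hedged sketch: allocate EV charging energy to the hours with the
# lowest conventional load ("valley filling") versus everyone plugging
# in at the evening peak ("uncoordinated").
def valley_fill(base_load_kw, ev_energy_kwh, charger_kw=11):
    """Greedily place charging energy into the lowest-load hours."""
    load = list(base_load_kw)
    remaining = ev_energy_kwh
    while remaining > 1e-9:
        h = min(range(len(load)), key=lambda i: load[i])  # lowest-load hour
        step = min(charger_kw, remaining)
        load[h] += step
        remaining -= step
    return load

base = [30, 20, 20, 40]                 # conventional load per hour (kW)
coordinated = valley_fill(base, 22)
print(coordinated)                      # [30, 31, 31, 40] -> peak unchanged
uncoordinated = [30, 20, 20, 40 + 22]   # all charging lands on the peak hour
print(max(uncoordinated), max(coordinated))  # 62 40
```

Coordination leaves the grid peak untouched, which is the mechanism behind avoiding grid expansion mentioned above.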
New storage technologies, such as Flash and Non-Volatile Memories, with fundamentally different properties are appearing. Leveraging their performance and endurance requires a redesign of the existing architecture and algorithms in modern high-performance databases. Multi-Version Concurrency Control (MVCC) approaches in database systems maintain multiple timestamped versions of a tuple. Once a transaction reads a tuple, the database system tracks and returns the respective version, eliminating lock requests. Hence, under MVCC reads are never blocked, which leverages well the excellent read performance (high throughput, low latency) of new storage technologies. Upon tuple updates, however, established implementations of MVCC approaches (such as Snapshot Isolation) lead to multiple random writes – caused by (i) the creation of the new version and (ii) the in-place invalidation of the old version – thus generating suboptimal access patterns for the new storage media. The combination of an append-based storage manager operating with tuple granularity and snapshot isolation addresses asymmetry and in-place updates. In this paper, we highlight novel aspects of log-based storage in multi-version database systems on new storage media. We claim that multi-versioning and append-based storage can be used to effectively address asymmetry and endurance. We identify multi-versioning as the approach to address data placement in complex memory hierarchies. We focus on version handling, (physical) version placement, compression and collocation of tuple versions on Flash storage and in complex memory hierarchies, and we identify possible read- and cache-related optimizations.
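The contrast between in-place invalidation and append-only versioning can be sketched in a few lines. This is a conceptual illustration of the principle, not the storage manager described in the paper:

```python
# Hedged sketch: an append-only multi-version store. Updates never touch
# old versions in place; a snapshot read returns the newest version whose
# timestamp is visible to the reader.
class AppendMVCC:
    def __init__(self):
        self.log = []    # append-only storage: (tuple_id, ts, value)
        self.index = {}  # tuple_id -> positions of its versions in the log

    def write(self, tid, ts, value):
        # a new version is appended; the old one is NOT invalidated in place,
        # avoiding the random write that hurts Flash endurance
        self.index.setdefault(tid, []).append(len(self.log))
        self.log.append((tid, ts, value))

    def read(self, tid, ts):
        # snapshot read: newest version with commit timestamp <= reader ts
        for pos in reversed(self.index.get(tid, [])):
            _, vts, value = self.log[pos]
            if vts <= ts:
                return value
        return None

store = AppendMVCC()
store.write(1, 10, "a")
store.write(1, 20, "b")   # supersedes "a" without rewriting it
print(store.read(1, 15))  # a  (older snapshot still sees the old version)
print(store.read(1, 25))  # b
```

Because every write is an append, the write pattern stays sequential, which matches the asymmetric write characteristics of Flash discussed above.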
Painting galleries typically provide a wealth of data composed of several data types. Such multivariate data are too complex for laymen like museum visitors who want to first get an overview of all paintings and then look for specific categories. Ultimately, the goal is to guide the visitor to a specific painting that he wishes to look at more closely. In this paper, we describe an interactive visualization tool that provides such an overview and lets people experiment with the more than 41,000 paintings collected in the Web Gallery of Art. To build such an interactive tool, our technique is composed of different steps: data handling, algorithmic transformations, visualizations, interactions, and the human user working with the tool with the goal of detecting insights in the provided data. We illustrate the usefulness of the visualization tool by applying it to such characteristic data and show how one can get from an overview of all paintings to specific paintings.
Forecasting intermittent demand time series is a challenging business problem, and companies have difficulties forecasting this particular demand pattern. On the one hand, it is characterized by many non-demand periods, so classical statistical forecasting algorithms, such as ARIMA, only work to a limited extent. On the other hand, companies often cannot meet the requirements for good forecasting models, such as providing sufficient training data. Recent major advances of artificial intelligence in applications are largely based on transfer learning. In this paper, we investigate whether this method, originating from computer vision, can improve the forecasting quality of intermittent demand time series using deep learning models. Our empirical results show that, in total, transfer learning can reduce the mean squared error by 65 percent. We also show that especially short (65 percent reduction) and medium-long (91 percent reduction) time series benefit from this approach.
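The transfer-learning principle (pretrain on a data-rich task, then fine-tune briefly on the data-poor target) can be illustrated with a toy numeric example. This is NOT the paper's deep-learning setup; the model, data and step counts are all synthetic:

```python
# Hedged sketch: with the same small budget of update steps on the target
# task, starting from pretrained parameters beats starting from scratch.
def gd_fit(pairs, w=0.0, b=0.0, steps=10, lr=0.05):
    """Fit y = w*x + b by plain gradient descent on squared error."""
    n = len(pairs)
    for _ in range(steps):
        gw = sum(2 * (w * x + b - y) * x for x, y in pairs) / n
        gb = sum(2 * (w * x + b - y) for x, y in pairs) / n
        w, b = w - lr * gw, b - lr * gb
    return w, b

def mse(pairs, w, b):
    return sum((w * x + b - y) ** 2 for x, y in pairs) / len(pairs)

xs = [-3, -2, -1, 0, 1, 2, 3]
source = [(x, 2 * x + 1.0) for x in xs]   # data-rich source task
target = [(x, 2 * x + 1.6) for x in xs]   # related target task, few steps

w0, b0 = gd_fit(source, steps=500)        # pretraining converges on source
transfer = mse(target, *gd_fit(target, w0, b0, steps=10))  # fine-tune
scratch = mse(target, *gd_fit(target, 0.0, 0.0, steps=10)) # from scratch
print(transfer < scratch)                 # True
```

The pretrained start already encodes the shared structure of the tasks, so the scarce target updates only have to correct the small task-specific offset, mirroring why short series benefited most in the study above.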
Presently, many companies are transforming their strategy and product base, as well as their culture, processes and information systems, to become more digital or to strive for digital leadership. In recent years, new business opportunities have appeared that use the potential of the Internet and related digital technologies, like the Internet of Things, services computing, cloud computing, edge and fog computing, social networks, big data with analytics, mobile systems, collaboration networks, and cyber-physical systems. Digitization fosters the development of IT environments with many rather small and distributed structures, like the Internet of Things, microservices, or other micro-granular elements. This has a strong impact on architecting digital services and products. The change from a closed-world modeling perspective to a more flexible open-world composition and evolution of micro-granular system architectures defines the moving context for adaptable systems. We focus on a continuous bottom-up integration of micro-granular architectures for a huge number of dynamically growing systems and services, as part of a new digital enterprise architecture for service-dominant digital products.
The current advancement of Artificial Intelligence (AI), combined with other digitalization efforts, significantly impacts service ecosystems. Artificial intelligence opens up substantial new opportunities for the co-creation of value and the development of intelligent service ecosystems. Motivated by experiences and observations from digitalization projects, this paper presents new methodological perspectives and experiences from academia and practice on architecting intelligent service ecosystems and explores the impact of artificial intelligence through real cases supporting an ongoing validation. Digital enterprise architecture models serve as an integral representation of the business, information, and technological perspectives of intelligent service-based enterprise systems to support their management and development. This paper focuses on architectural models for intelligent service ecosystems, showing the fundamental business mechanism of AI-based value co-creation as well as the corresponding digital architecture and management models, and presents the key architectural model perspectives for the development of intelligent service ecosystems.
Our paper gives first answers to a fundamental question: how can the design of architectures of intelligent digital systems and services be accomplished methodologically? Intelligent systems and services are the goal of many current digitalization efforts and part of massive digital transformation efforts based on digital technologies. Digital systems and services are the foundation of digital platforms and ecosystems. Digitalization disrupts existing businesses, technologies, and economies and promotes the architecture of open environments. This has a strong impact on new value-added opportunities and the development of intelligent digital systems and services. Digital technologies such as artificial intelligence, the Internet of Things, services computing, cloud computing, big data with analytics, mobile systems, and social enterprise networks are important enablers of digitalization. The current publication presents our research on the architecture of intelligent digital ecosystems, products and services influenced by the service-dominant logic. We present original methodological extensions and a new reference model for digital architectures with an integral service and value perspective to model intelligent systems and services that effectively align digital strategies and architectures, with artificial intelligence as a main element to support intelligent digitalization.
The early involvement of experience gained through intelligence and data analysis is becoming increasingly important for developing new products, leading to a completely different conception of product creation, development and engineering processes that uses the advantages the digital twin entails. A novel stage-gate process is introduced that is holistically anchored in learning factories, adopting idea generation and idea screening at an early stage, beta testing of first prototypes, technical implementation in real production scenarios, business analysis, market evaluation, pricing, service models as well as innovative social media portals. Corresponding product modelling in the sense of sustainability, circular economy and data analytics forecasts the product on the market both before and after market launch, with the interlinking of data interpretation in near real time. The digital twin represents the link between the digital model and the digital shadow. Additionally, the connection of the digital twin with the product provides constantly updated operating status and process data as well as a mapping of technical properties and real-world behaviour. A future networked product, with embedded information technology and the ability to initiate and carry out its own further development, is able to interact with people and environments and is thus relevant to the way of life of future generations. In today's development work for this new product creation approach, the "Werk150" is, on the one hand, the object of the development itself and, on the other hand, the validation environment. In the next step, new learning modules and scenarios for trainings at master level will be derived from these findings.
Patterns are virtually simulated in 3D CAD programs before production to check the fit. However, achieving lifelike representations of human avatars, especially regarding soft tissue dynamics, remains challenging. This is mainly because conventional avatars in garment CAD programs are simulated with a continuous hard surface that does not correspond to the physical and mechanical properties of human soft tissue. In the real world, the human body’s natural shape is affected by the contact pressure of tight-fitting textiles. To verify the fit of a simulated garment, the interactions between the individual body shape and the garment must be considered. This paper introduces an innovative approach to digitising the softness of human tissue using 4D scanning technology. The primary objective of this research is to explore the interactions between tissue softness and different compression levels of apparel, which exert pressure on the tissue, in order to capture the changes in the natural shape. To generate data and model an avatar with soft-body physics, it is essential to capture the deformability and elasticity of the soft tissue and map them into the modification options of a simulation. To this end, various methods from different fields were researched and compared, and 4D scanning was evaluated as the most suitable method for capturing tissue deformability in vivo. In particular, it should be considered that the human body has different deformation capabilities depending on age and the amount of muscle and body fat. In addition, different tissue zones have different mechanical properties, so it is essential to identify and classify them to back up these properties for the simulation. It has been shown that by digitising the data obtained at the different defined pressure levels, a prediction of the tissue deformation of the specific person becomes possible. As technology advances and data sets grow, this approach has the potential to reshape how we verify fit digitally with soft avatars and leverage their realistic soft tissue properties for various practical purposes.
Due to Industry 4.0, the entire value creation has the chance to undergo a fundamental technological transformation, the realisation of which, however, requires the commitment of every company for its own benefit. The new approaches of Industry 4.0 are often hardly evaluated, let alone proven, so that SMEs in particular often cannot properly estimate the potentials and risks and often wait too long with the migration towards Industry 4.0. In addition, they often do not pursue an integrated concept in order to identify possible potentials through changes in their business models. As part of the research project "GEN-I 4.0 – Geschäftsmodell-Entwicklung für die Industrie 4.0" (business model development for Industry 4.0), the ESB Business School at Reutlingen University of Applied Sciences and the Fraunhofer Institute for Industrial Engineering and Organization IAO were engaged by the Baden-Württemberg Foundation from 2016 to 2018 to develop tools and an approach for how the local economy can develop digital business models in a methodical, beneficial and targeted manner. Through international analyses and interviews, GEN-I 4.0 gained and concretized the knowledge required for evaluating and selecting solutions and approaches for developing digital business models. Together with the project partners' know-how on Industry 4.0 and business model development, the findings were incorporated into the development of two software tools that show SMEs, online and in a self-assessment, the potentials of Industry 4.0 for their individual business model as well as their individual risk, and that give them a comprehensively structured, concrete approach to development. Users of the tools are supported by the selected platform for networking different players to implement innovative business models, accompanied by coaching concepts for the companies in the follow-up and implementation of the assessment results.