Model-guided Therapy and Surgical Workflow Systems are two interrelated research fields that have developed separately in recent years. To make full use of both technologies, it is necessary to integrate them and connect them to Hospital Information Systems. We propose a framework for integrating Model-guided Therapy into Hospital Information Systems based on the Electronic Medical Record, and a task-based Workflow Management System suitable for clinical end users. Two prototypes - one based on the Business Process Modeling Language, one based on a scrum board - are presented. From the experience with these prototypes, we developed a novel personalized visualization system for Surgical Workflows and Model-guided Therapy. Key challenges for further development are automated situation detection and a common communication infrastructure.
This paper presents an approach for the implementation of a modular and scalable power electronics device for controlling electric drives in the field of electric vehicles using wide bandgap semiconductor devices. The main idea is to achieve the required output currents or voltages by connecting adequately designed hardware modules in parallel or in series. This particular design is based on the fact that the single modules generate a continuous and specified output voltage from a given DC voltage, e.g. an intermediate circuit or battery voltage. The main benefit is that different current or voltage requirements can be satisfied based on a single module, thus decreasing development and production costs. The current paper focuses on the parallel connection of such modules. A control architecture is illustrated and a first proof of concept is given.
While several service-based maintainability metrics have been proposed in the scientific literature, reliable approaches to automatically collect these metrics are lacking. Since static analysis is complicated for decentralized and technologically diverse microservice-based systems, we propose a dynamic approach to calculate such metrics from runtime data via distributed tracing. The approach focuses on simplicity, extensibility, and broad applicability. As a first prototype, we implemented a Java application with a Zipkin integrator, 23 different metrics, and five export formats. We demonstrated the feasibility of the approach by analyzing the runtime data of an example microservice-based system. During an exploratory study with six participants, 14 of the 18 services were invoked via the system's web interface. For these services, all metrics were calculated correctly from the generated traces.
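The flavor of deriving a service metric from trace data can be pictured with a small sketch. The span format, the `absolute_importance` metric definition, and all service names are illustrative assumptions, not the paper's actual implementation or the Zipkin data model:

```python
from collections import defaultdict

def absolute_importance(traces):
    """Count the distinct consumers (callers) of each service.

    Each trace is given as a list of (caller, callee) span pairs -- a
    simplification of what a tracing backend would export.
    """
    callers = defaultdict(set)
    for trace in traces:
        for caller, callee in trace:
            callers[callee].add(caller)
    return {service: len(c) for service, c in callers.items()}

# Two hypothetical traces through a toy system.
traces = [
    [("gateway", "orders"), ("orders", "billing")],
    [("gateway", "billing")],
]
importance = absolute_importance(traces)  # billing has two distinct callers
```

Real trace data would of course carry timestamps, trace IDs, and endpoint details, which is what makes richer metrics (and the 23 implemented ones) possible.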
This paper studies whether a monetary union needs a fiscal union, in particular in the Eurozone. On 1 January 1999, despite controversial debates, the rule-based Economic and Monetary Union (EMU) started without a fiscal union. I show that economic convergence in the EMU has been weak over the past 18 years. In addition, I argue that a fiscal union does not solve the past disintegration failures.
I demonstrate that the major flaws are domestic policy failures, not institutional failures in the euro area. Consequently, establishing a monetary union without a political union is a risky strategy. Indeed, the rule-based architecture of Maastricht is not solely to blame for the crisis. The root causes are political flaws combined with the rather weak enforcement of the rules. I propose a genuine redesign of the rule-based paradigm without a fiscal union. Yet a monetary union without a fiscal union works effectively if rule enforcement is more automatic and independent of domestic and European policy-making.
Heat pumps are a vital element for reaching the greenhouse gas (GHG) reduction targets in the heating sector, but their system integration requires smart control approaches. In this paper, we first offer a comprehensive literature review and a definition of the term control for the described context. Additionally, we present a control approach that consists of an optimal scheduling module coupled with a detailed energy system simulation module. The aim of this integrated two-part control approach is to improve the performance of an energy system equipped with a heat pump, while recognizing the technical boundaries of the energy system in full detail. By applying this control to a typical family household situation, we illustrate that the integrated approach results in a more realistic heat pump operation and thus a more realistic assessment of the control performance, while still achieving lower operational costs.
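As a toy illustration of the scheduling half of such a control approach, the sketch below picks the cheapest time steps to cover a heat demand. The greedy strategy, the function name, and all numeric parameters are assumptions for illustration; a real optimal scheduling module coupled to a detailed simulation would also respect storage, ramping, and comfort constraints:

```python
def schedule_heat_pump(prices, total_heat, max_heat_per_step, cop=3.0):
    """Greedy sketch: run the heat pump in the cheapest time steps until
    the heat demand is covered, ignoring the technical boundaries that a
    detailed energy system simulation module would enforce."""
    heat = [0.0] * len(prices)
    remaining = total_heat
    for t in sorted(range(len(prices)), key=lambda i: prices[i]):
        if remaining <= 0:
            break
        q = min(max_heat_per_step, remaining)
        heat[t] = q
        remaining -= q
    # Electricity drawn is heat output divided by the coefficient of performance.
    cost = sum(heat[t] / cop * prices[t] for t in range(len(prices)))
    return heat, cost

# Four time steps with invented electricity prices (EUR/kWh).
heat, cost = schedule_heat_pump([0.30, 0.10, 0.20, 0.40],
                                total_heat=6.0, max_heat_per_step=4.0)
```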
In recent years, the number of hybrid work systems using human-robot collaboration (HRC) has increased in industrial production environments, enhancing productivity while reducing work-related burden. Despite the growing availability of HRC-suitable manipulation and safety technology, tools and techniques facilitating the design, planning and implementation process are still lacking. System engineers who strive to implement technically feasible, ergonomically meaningful and economically beneficial HRC applications need to make design and technology decisions in various subject areas. In this work, morphological analysis is applied to establish a description model that can serve both as a supporting design guideline for future HRC applications of value-adding, industrial quality and as a tool to characterize and compare existing applications. It focuses on HRC within assembly processes and illustrates the complexity of HRC applications in a comprehensible manner through its multi-dimensional structure. The morphology has been validated through its application to various existing industrial HRC applications and research demonstrators, as well as through interviews with experts from academia.
3D morphable face models are a powerful tool in computer vision. They consist of a PCA model of face shape and colour information and allow the reconstruction of a 3D face from a single 2D image. 3D morphable face models are used for 3D head pose estimation, face analysis, face recognition, and, more recently, facial landmark detection and tracking. However, they are not as widely used as 2D methods, since the process of building and using a 3D model is much more involved.
In this paper, we present the Surrey Face Model, a multi-resolution 3D morphable model that we make available to the public for non-commercial purposes. The model contains different mesh resolution levels and landmark point annotations as well as metadata for texture remapping. Accompanying the model is a lightweight open-source C++ library designed with simplicity and ease of integration as its foremost goals. In addition to basic functionality, it contains pose estimation and face frontalisation algorithms. With the tools presented in this paper, we aim to close two gaps. First, by offering different model resolution levels and fast fitting functionality, we enable the use of a 3D Morphable Model in time-critical applications like tracking. Second, the software library makes it easy for the community to adopt the 3D morphable face model in their research, and it offers a public place for collaboration.
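The PCA model at the heart of any 3D morphable face model can be summarized in a few lines: a face shape is the mean shape plus a linear combination of principal components. The dimensions and values below are toy assumptions, not taken from the Surrey Face Model or its library:

```python
import numpy as np

def reconstruct_shape(mean, basis, coeffs):
    """PCA shape model: mean shape plus a linear combination of
    principal components.

    mean   -- stacked vertex coordinates, shape (3N,)
    basis  -- principal components as columns, shape (3N, K)
    coeffs -- shape coefficients, shape (K,)
    """
    return mean + basis @ coeffs

mean = np.zeros(6)                # toy model: two vertices, (x, y, z) each
basis = np.eye(6)[:, :2]          # two orthonormal toy components
coeffs = np.array([1.5, -0.5])
shape = reconstruct_shape(mean, basis, coeffs)
```

Fitting a model to an image essentially inverts this map: find the coefficients (plus pose and camera parameters) whose reconstructed shape best explains the observed 2D landmarks.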
Purpose: This paper aims to conceptualize and empirically test the determinants of service interaction quality (SIQ) as the attitude, behavior and expertise of a service provider (SP). Further, the individual and simultaneous effects of SIQ and its dimensions on important marketing outcomes are tested. Design/methodology/approach: A narrative review of extant research helps formulate a conceptual model of SIQ, which is investigated using univariate and multivariate meta-analysis.
Findings: There are interdependencies between drivers of SIQ that underline the need to conceptualize service interaction as a dyadic phenomenon; use contemporary multilevel models, dyadic models, non-linear structural equation modeling and process studies; and study new and diverse services contexts. Meta-analysis illustrates the relative importance of the three drivers of SIQ and, in turn, their impact on consumer satisfaction and loyalty.
Research limitations/implications: The meta-analysis is based on existing research, which, unfortunately, has not examined critical services or exigency situations where SIQ is of paramount importance. Future research will be tasked with diversifying to several important domains where SIQ is a critical aspect of perceived service quality.
Practical implications: This study emphasizes that, although the expertise of an SP is important, firms would be surprised to learn that the attitude and behavior of their employees are equally important antecedents. In fact, there is a delicate balance that needs to be found; otherwise, attitudinal factors can have an overall counterproductive effect on consumer satisfaction.
Originality/value: This paper provides an empirical synthesis of SIQ and opens up interesting areas for further research.
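The univariate meta-analysis named in the methodology rests on inverse-variance pooling of effect sizes. A minimal fixed-effect sketch with invented effect sizes and variances (the paper's actual models and moderators are far richer):

```python
import math

def fixed_effect_meta(effects, variances):
    """Inverse-variance weighted mean effect size and its standard error,
    the basic building block of a univariate fixed-effect meta-analysis."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return pooled, se

# Invented correlation-like effect sizes from three hypothetical studies.
pooled, se = fixed_effect_meta([0.40, 0.55, 0.30], [0.04, 0.02, 0.08])
```

More precise studies (smaller variances) pull the pooled estimate toward their own effect size, which is exactly the behavior the inverse-variance weights encode.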
This paper presents a new broadband antenna for satellite communications. It describes the procedure involved in the design of a microstrip antenna array and its multi-level passive feed network that together yield circular polarization and the necessary gain to be used in an earth-satellite link. The designed antenna is notable for its large bandwidth, circular polarization, high gain and small dimensions.
A new planar compact antenna composed of two crossed Cornu spirals is presented. Each Cornu spiral is fed from the center of the linear part of the curvature between the two spirals, which forms the clothoid. Sequential rotation is applied using a sequential phase network to obtain circular polarization and increase the effective bandwidth. Signal integrity issues have been addressed in the design to ensure high-quality signal propagation. As a result, the antenna shows good radiation characteristics in the bandwidth of interest. Compared to antennas of the same size in the literature, it is broadband and of high gain. Although the proposed antenna has been designed for K- and Ka-band operation, it can also be adapted for lower and higher frequencies because of the linearity of the Maxwell equations.
A new method for the analysis of movement dependent parasitics in full custom designed MEMS sensors
(2017)
Due to the lack of sophisticated microelectromechanical systems (MEMS) component libraries, highly optimized MEMS sensors are currently designed using a polygon-driven design flow. The strength of this design flow is the accurate mechanical simulation of the polygons by finite element (FE) modal analysis. The result of the FE modal analysis is included in the system model together with the data of the (mechanical) static electrostatic analysis. However, the system model lacks the dynamic parasitic electrostatic effects arising from the electric coupling between the wiring and the moving structures. In order to include these effects in the system model, we present a method which enables quasi-dynamic parasitic extraction with respect to in-plane movements of the sensor structures. The method is embedded in the polygon-driven MEMS design flow using standard EDA tools. In order to take the influences of the fabrication process into account, such as etching process variations, the method combines the FE modal analysis and the fabrication process simulation data. This enables the analysis of dynamically changing electrostatic parasitic effects with respect to movements of the mechanical structures. Additionally, the result can be included in the system model, allowing the simulation of positive feedback of the electrostatic parasitic effects on the mechanical structures.
In this note we look at anisotropic approximation of smooth functions on bounded domains with tensor product splines. The main idea is to extend such functions and then use known approximation techniques on ℝ^d. We prove an error estimate for domains for which bounded extension operators exist. This obvious approach has some limitations. It is not applicable without restrictions on the chosen coordinate degree, even if the domain is as simple as the unit disk. Further, for approximation on ℝ^d there are error estimates in which the grid widths and directional derivatives are paired in an interesting way. It seems impossible to maintain this property using extension operators.
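The kind of estimate alluded to, in which each grid width is paired with a pure directional derivative, typically has the following form. This is a sketch under assumed notation (coordinate grid widths h_i, smoothness orders m_i tied to the coordinate degrees), not the note's exact statement:

```latex
% Anisotropic spline approximation error on \mathbb{R}^d (schematic):
% grid width h_i in coordinate i is paired with the pure directional
% derivative D_i^{m_i} of order m_i in that direction only.
\| f - s \|_{L_p(\mathbb{R}^d)}
  \le C \sum_{i=1}^{d} h_i^{m_i}\, \bigl\| D_i^{m_i} f \bigr\|_{L_p(\mathbb{R}^d)}
```

The note's point is that an extension operator need not preserve the individual directional-derivative norms on the right-hand side, which is why this pairing seems impossible to maintain on bounded domains.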
This paper introduces a novel placement methodology for a common-centroid (CC) pattern generator. It can be applied to various integrated circuit (IC) elements, such as transistors, capacitors, diodes, and resistors. The proposed method consists of a constructive algorithm, which generates an initial, close-to-optimum solution, and an iterative algorithm, which is used subsequently if the output of the constructive algorithm does not satisfy the desired criteria. The outcome of this work is an automatic CC placement algorithm for IC element arrays. Additionally, the paper presents a method for evaluating CC arrangements, which allows the quality of an array to be assessed and different placement methods to be compared.
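The common-centroid idea itself can be illustrated in one dimension: mirroring a half-row makes both devices share the same centroid, which is the matching property a 2-D pattern generator optimizes. A deliberately minimal sketch, not the paper's constructive or iterative algorithm:

```python
def common_centroid_row(units_a, units_b):
    """Build half the row, then mirror it, so devices A and B share the
    same centroid. Unit counts must be even. This one-dimensional sketch
    ignores routing, dispersion and gradient criteria that a real
    common-centroid placer must balance."""
    assert units_a % 2 == 0 and units_b % 2 == 0
    half = ["A"] * (units_a // 2) + ["B"] * (units_b // 2)
    return half + half[::-1]

row = common_centroid_row(4, 2)   # A A B B A A
```

Mirroring guarantees first-order gradient cancellation: any linearly varying process parameter affects A and B equally because their position averages coincide.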
This article presents a modified method of performing power flow calculations as an alternative to pure energy-based simulations of off-grid hybrid systems. The enhancement consists in transforming the scenario-based power flow method into a discrete time-dependent algorithm with the inclusion of bus and controller dynamics.
Continuous manufacturing is becoming more important in the biopharmaceutical industry. This processing strategy is favorable, as it is more efficient and flexible, and has the potential to produce higher and more consistent product quality. At the same time, it faces some challenges, especially in cell culture. As a steady state has to be maintained over a prolonged time, it is unavoidable to implement advanced process analytical technologies to control the relevant process parameters in a fast and precise manner. One such analytical technology is Raman spectroscopy, which has proven its advantages for process monitoring and control mostly in (fed-)batch cultivations. In this study, an in-line flow cell for Raman spectroscopy is included in the cell-free harvest stream of a perfusion process. Quantitative models for glucose and lactate were generated based on five cultivations originating from varying bioreactor scales. After successfully validating the glucose model (Root Mean Square Error of Prediction (RMSEP) of ∼0.2 g/L), it was employed for control of an external glucose feed in a cultivation with a glucose-free perfusion medium. The generated model was successfully applied to perform process control at 4 g/L and at 1.5 g/L glucose over several days, with a variability of ±0.4 g/L. The results demonstrate the high potential of Raman spectroscopy for advanced process monitoring and control of a perfusion process with a bioreactor- and scale-independent measurement method.
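The RMSEP figure used to validate the glucose model is a standard quantity and easy to state in code; the predicted and reference values below are invented for illustration:

```python
import math

def rmsep(predicted, reference):
    """Root mean square error of prediction between model predictions and
    off-line reference measurements (e.g. glucose in g/L)."""
    n = len(predicted)
    return math.sqrt(sum((p - r) ** 2 for p, r in zip(predicted, reference)) / n)

# Invented example: three predicted vs. reference glucose values in g/L.
error = rmsep([4.1, 3.8, 4.3], [4.0, 4.0, 4.1])
```

An RMSEP of ~0.2 g/L, as reported above, means the calibration's typical prediction error is small relative to the 1.5-4 g/L setpoints it was asked to hold.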
A novel brushless excitation concept for synchronous machines with a rotating power converter is proposed in this paper. The concept does not need an auxiliary winding or any other modification to the machine structure apart from an inverter with a DC link capacitor and a controller on the rotor. The power required for the rotor excitation is provided by injecting harmonics into the stator winding. Thus, a voltage is induced in the field coil. The rotor inverter is controlled such that the alternating current charges the DC link capacitor. At the same time, the inverter supplies the DC field current to the field coil. The excitation concept is first developed in theory, then presented using an analytical model and FEA, and lastly investigated with a preliminary experimental setup.
A novel configuration of the dual active bridge (DAB) DC/DC converter is presented, enabling more efficient wide voltage range conversion at light loads. A third phase leg as well as a center-tapped transformer are introduced on one side of the converter. This concept provides two different turns ratios, thus extending the zero voltage switching operation and resulting in higher efficiency. A laboratory prototype was built converting an input voltage of 40V to an output voltage in the range of 350V to 650V. Measurements show a significant increase of up to 20% in efficiency for light-load operation.
This paper presents a novel emulation concept for testing smart contracts and Distributed Ledger Technologies (DLT) in distributed control or energy economy tasks and use cases. The concept uses state-of-the-art behavioral modeling tools such as Matlab Simulink and presents a possible way to overcome the shortcoming that Simulink cannot communicate with DLT nodes directly. This is solved through a middleware solution. After this, an example used in verifying the test bed is presented and the target demonstration object is described. Finally, the possible expansion of the system is discussed.
Many GaN power transistors contain a PN junction between the gate and the channel region close to the source. In order to maintain the on-state, current must be supplied continuously to the junction. Therefore, the commonly recommended approach uses a gate bias voltage of 12V to compensate for the Miller current through a boost circuit. For the same purpose, a novel gate driving method based on an inductive feed-forward has been presented. With this, stable turn-on can be achieved even for a bias voltage of only 5V. The effectiveness of this concept is demonstrated by double pulse measurements, switching currents up to 27A at a voltage of 400V. For both approaches, a compact design with low source inductance is characterized. In addition to the significant reduction of the gate bias voltage and peak gate current, the new approach reduces the switching losses for load currents >23 A.
Modern power semiconductor devices have low capacitances and can therefore achieve very fast switching transients under hard-switching conditions. However, these transients are often limited by parasitic elements, especially by the source inductance and the parasitic capacitances of the power semiconductor. These limitations cannot be compensated by conventional gate drivers. To overcome this, a novel gate driver approach for power semiconductors was developed. It uses a transformer which accelerates the switching by transferring energy from the source path to the gate path.
Experimental results of the novel gate driver approach show a turn-on energy reduction of 78% (from 80 μJ down to 17 μJ) with a drain-source voltage of 500V and a drain current of 60 A. Furthermore, the efficiency improvement is demonstrated for a hard-switching boost converter. For a switching frequency of 750 kHz with an input voltage of 230V and an output voltage of 400V, it was possible to extend the output power range by 35% (from 2.3 kW to 3.1 kW) due to the reduction of the turn-on losses, thereby lowering the junction temperature of the GaN-HEMT.
A novel gate driving approach to balance the transient current of parallel-connected GaN-HEMTs
(2018)
To enable higher current handling capability of GaN-based DC/DC converters, devices have to be used in parallel. However, their switching times differ, especially if their threshold voltages are not identical, which causes unbalanced device current. This paper focuses on the homogeneous distribution of turn-on switching losses of GaN-HEMTs connected in parallel. By applying a new gate driver concept, the transient current is distributed evenly. The effectiveness of this concept is demonstrated by double pulse measurements, for switching currents up to 45A and a voltage of 400V. A uniform current distribution is achieved, including a reduction of the turn-on losses by 50% compared to a conventional setup.
A new two-dimensional fluorescence sensor system was developed for in-line monitoring of mammalian cell cultures. Fluorescence spectroscopy allows for the detection and quantification of naturally occurring intra- and extracellular fluorophores in the cell broth. The fluorescence signals correlate with the cells' current redox state and other relevant process parameters. Cell culture pretests with twelve different excitation wavelengths showed that only three wavelengths account for the vast majority of spectral variation. Accordingly, the newly developed device utilizes three high-power LEDs as excitation sources in combination with a back-thinned CCD spectrometer for fluorescence detection.
In contrast to IC design, MEMS design still lacks sophisticated component libraries. Therefore, the physical design of MEMS sensors is mostly done by simply drawing polygons. Hence, the sensor structure is only given as plain graphic data, which hinders the identification and investigation of topology elements such as springs, anchors, masses and electrodes. In order to solve this problem, we present a rule-based recognition algorithm which identifies the architecture and the topology elements of a MEMS sensor. In addition to graphic data, the algorithm makes use of only a few marking layers, as well as net and technology information. Our approach enables RC-extraction with commercial field solvers and a subsequent synthesis of the sensor circuit. The mapping of the extracted RC-values to the topology elements of the sensor enables a detailed analysis and optimization of actual MEMS sensors.
In the upcoming years, huge benefits are expected from Artificial Intelligence (AI). However, there are also risks involved in the technology, such as accidents of autonomous vehicles or discrimination by AI-based recruitment systems. This study aims to investigate public perception of these risks, focusing on realistic risks of Narrow AI, i.e., the type of AI that is already productive today. Based on perceived risk theory, several risk scenarios are examined using data from an exploratory survey. This research shows that AI is perceived positively overall. The participants, however, do evaluate AI critically when being confronted with specific risk scenarios. Furthermore, a strong positive relationship between knowledge about AI and perceived risk could be shown. This study contributes to knowledge by advancing our understanding of the awareness and evaluation of the risks by consumers and has important implications for product development, marketing and society.
Critical size bone defects and non-union fractures are still challenging to treat. Cell-loaded bone substitutes have shown improved bone ingrowth and bone formation. However, a lack of methods for homogeneously colonizing scaffolds limits the maximum volume of bone grafts. Additionally, therapy robustness is impaired by heterogeneous cell populations after graft generation. Our aim was to establish a technology for generating grafts with a size of 10.5 mm in diameter and 25 mm in height, and thus grafts suited for the treatment of critical size bone defects. Therefore, a novel tailor-made bioreactor system was developed, allowing standardized flow conditions in a porous poly(L-lactide-co-caprolactone) material. Scaffolds were seeded with primary human mesenchymal stem cells derived from four different donors. In contrast to static experimental conditions, homogeneous cell distributions were accomplished under dynamic culture. Additionally, culture in the bioreactor system allowed the induction of osteogenic lineage commitment after one week of culture without the addition of soluble factors. This was demonstrated by quantitative analysis of calcification and of gene expression markers related to the osteogenic lineage. In conclusion, the novel bioreactor technology allows efficient and standardized conditions for generating bone substitutes that are suitable for the treatment of critical size defects in humans.
A single-phase fixed-frequency operated power factor correction circuit with reduced switching losses is proposed. The circuit uses the combination of a boost converter with an added clamp-switch, a pulse wave shaping circuit, and a standard control IC to discharge the transistor's output capacitance prior to its turn-on. In this way, a very low-complexity control circuit implementation to reduce switching losses or even achieve complete zero-voltage switching without additional sensors is possible. Moreover, this operation method is achieved at a constant switching frequency, possibly simplifying the design of the EMI filter and the converter's inductor. Experimental test results for a 100 W prototype converter are presented to validate the feasibility of the proposed operating method and corresponding circuit structure.
This work presents a spiral antenna array for use in the V- and W-band. An array with Dolph-Chebyshev coefficients is investigated to address the low gain and side lobe level of the radiating structure. The challenge is to provide an antenna that is not only well matched but also exhibits an appreciable effective bandwidth in the frequency bands of interest. Its radiation properties, including the effective bandwidth and the gain, are analyzed for the W-band.
Cardiovascular diseases are directly or indirectly responsible for up to 38.5% of all deaths in Germany and thus represent the most frequent cause of death. At present, heart diseases are mainly discovered by chance during routine visits to the doctor or when acute symptoms occur. However, there is no practical method to proactively detect diseases or abnormalities of the heart in the daily environment and to take preventive measures for the person concerned. Long-term ECG devices, as currently used by physicians, are simply too expensive, impractical, and not widely available for everyday use. This work aims to develop an ECG device suitable for everyday use that can be worn directly on the body. For this purpose, an already existing hardware platform will be analyzed, and the corresponding potential for improvement will be identified. A precise picture of the existing data quality is obtained by metrological examination, and corresponding requirements are defined. Based on these identified optimization potentials, a new ECG device is developed. The revised ECG device is characterized by a high integration density and combines all components directly on one board except the battery and the ECG electrodes. The compact design allows the device to be attached directly to the chest. An integrated microcontroller allows digital signal processing without the need for an additional computer. Central features of the evaluation are a peak detection for detecting R-peaks and a calculation of the current heart rate based on the RR interval. To ensure the validity of the detected R-peaks, a model of the anatomical conditions is used. Thus, unrealistic RR-intervals can be excluded. The wireless interface allows continuous transmission of the calculated heart rate. Following the development of hardware and software, the results are verified, and appropriate conclusions about the data quality are drawn. 
As a result, a very compact and wearable ECG device with different wireless technologies, data storage, and evaluation of RR intervals was developed. Some tests yielded runtimes of up to 24 hours with wireless LAN activated and streaming.
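The R-peak plausibility idea described above can be sketched as threshold peak detection plus a minimum RR interval. The threshold logic, the 0.3 s bound, and the synthetic signal are illustrative assumptions, not the device's actual algorithm:

```python
def detect_r_peaks(signal, fs, threshold, min_rr=0.3):
    """Threshold-based local-maximum detection with an RR plausibility
    check: peaks closer than min_rr seconds to the previously accepted
    peak are rejected as anatomically implausible."""
    peaks = []
    for i in range(1, len(signal) - 1):
        is_peak = (signal[i] > threshold
                   and signal[i] >= signal[i - 1]
                   and signal[i] > signal[i + 1])
        if is_peak and (not peaks or (i - peaks[-1]) / fs >= min_rr):
            peaks.append(i)
    return peaks

def heart_rate(peaks, fs):
    """Current heart rate in bpm from the last RR interval."""
    if len(peaks) < 2:
        return None
    return 60.0 * fs / (peaks[-1] - peaks[-2])

# Synthetic 2-second trace at 100 Hz with one spurious near-duplicate peak.
signal = [0.0] * 200
signal[10], signal[12], signal[110] = 1.0, 0.9, 1.0
peaks = detect_r_peaks(signal, fs=100, threshold=0.5)
```

The spurious sample 20 ms after the first peak is discarded by the RR check, so only two R-peaks, one second apart, remain.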
A procedural approach to automate the manual design process in analog integrated circuit design
(2018)
This paper presents a novel approach to automating the design of analog integrated circuits: (1) the Expert Design Plan (EDP), a procedural generator, and (2) the EDP Language, a high-level description language for writing an EDP. An EDP is a parameterizable, executable script which reproduces a designer's course of action when designing a circuit. Thus, an EDP formalizes the design expert's knowledge-based strategy and makes it reusable. Since it is essential that an EDP represents a circuit designer's way of thinking and working as closely as possible, the designers themselves should be enabled to create the EDP. Therefore, our approach provides an input method through a domain-specific language called EDP Language (EDPL). Using this language is intuitive and requires no special training. In an exemplary implementation of our approach, a common-source amplifier is automatically sized using a set of only 10 instructions. Even on first use, our EDP approach proved more efficient than the manual sizing process.
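To convey the flavor of a procedural design plan without inventing EDPL syntax, here is a Python sketch of a parameterizable sizing script for a common-source amplifier using the textbook square-law model. The function, parameter names, and all device values are example assumptions, not the paper's instructions:

```python
def size_common_source(gain, r_out, i_d, k_n=200e-6, length=1e-6):
    """Reproduce a designer's sizing steps in order: derive the required
    transconductance from the gain spec (gain = gm * r_out), then the
    transistor width from the square-law relation
    gm = sqrt(2 * k_n * (W/L) * Id)."""
    g_m = gain / r_out
    w_over_l = g_m ** 2 / (2.0 * k_n * i_d)
    return g_m, w_over_l * length

# Example run: gain of 20 into a 10 kOhm load at 100 uA bias current.
g_m, width = size_common_source(gain=20.0, r_out=10e3, i_d=100e-6)
```

The point of an EDP is precisely this: the steps are an executable, reusable encoding of the designer's reasoning, and changing a specification simply re-runs the plan.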
The chemical synthesis of polysiloxanes from monomeric starting materials involves a series of hydrolysis, condensation and modification reactions with complex monomeric and oligomeric reaction mixtures. Real-time monitoring and precise process control of the synthesis process is of great importance to ensure reproducible intermediates and products and can readily be performed by optical spectroscopy. In chemical reactions involving rapid and simultaneous functional group transformations and complex reaction mixtures, however, the spectroscopic signals are often ambiguous due to overlapping bands, shifting peaks and changing baselines. The univariate analysis of individual absorbance signals is hence often only of limited use. In contrast, batch modelling based on the multivariate analysis of the time course of principal components (PCs) derived from the reaction spectra provides a more efficient tool for real time monitoring. In batch modelling, not only single absorbance bands are used but information over a broad range of wavelengths is extracted from the evolving spectral fingerprints and used for analysis. Thereby, process control can be based on numerous chemical and morphological changes taking place during synthesis. “Bad” (or abnormal) batches can quickly be distinguished from “normal” ones by comparing the respective reaction trajectories in real time. In this work, FTIR spectroscopy was combined with multivariate data analysis for the in-line process characterization and batch modelling of polysiloxane formation. The synthesis was conducted under different starting conditions using various reactant concentrations. The complex spectral information was evaluated using chemometrics (principal component analysis, PCA). Specific spectral features at different stages of the reaction were assigned to the corresponding reaction steps. Reaction trajectories were derived based on batch modelling using a wide range of wavelengths. 
Subsequently, complexity was reduced again to the most relevant absorbance signals in order to derive a concept for a low-cost process spectroscopic set-up which could be used for real-time process monitoring and reaction control.
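The batch-modelling idea, projecting each time point's spectrum onto principal components to obtain a reaction trajectory, can be sketched with synthetic data. This is illustrative only; the actual work uses FTIR spectra and established chemometrics tooling:

```python
import numpy as np

def pca_fit(spectra, n_pc=1):
    """Mean-center the spectra (rows = time points, columns = wavelengths)
    and return the mean plus the top n_pc principal components (as rows)."""
    mean = spectra.mean(axis=0)
    _, _, vt = np.linalg.svd(spectra - mean, full_matrices=False)
    return mean, vt[:n_pc]

def trajectory(spectra, mean, components):
    """Score of each time point on the principal components; plotted over
    time, this is the batch's reaction trajectory."""
    return (spectra - mean) @ components.T

rng = np.random.default_rng(0)
batch = rng.normal(size=(20, 50))     # synthetic: 20 time points, 50 bands
mean, comps = pca_fit(batch)
scores = trajectory(batch, mean, comps)
band = 3.0 * scores.std()             # simple tolerance band around the mean
```

A "bad" batch is one whose trajectory, computed against components fitted on normal batches, drifts outside such a tolerance band in real time.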
Motivation
In order to enable context-aware behavior of surgical assistance systems, the acquisition of various information about the current intraoperative situation is crucial. To achieve this, the complex task of situation recognition can be delegated to a specialized system. Consequently, a standardized interface is required for the seamless transfer of the recognized contextual information to the assistance systems, enabling them to adapt accordingly.
Methods
Our group analyzed four medical interface standards to determine their suitability for exchanging intraoperative contextual information. The assessment was based on a harmonized data and service model derived from the requirements of expected context-aware use cases. The Digital Imaging and Communications in Medicine (DICOM) and IEEE 11073 for Service-oriented Device Connectivity (SDC) were identified as the most appropriate standards.
Results
We specified how DICOM Unified Procedure Steps (UPS) can be used to effectively communicate contextual information. We proposed the inclusion of attributes to formalize different granularity levels of the surgical workflow.
Conclusions
DICOM UPS SOP classes can be used for the exchange of intraoperative contextual information between a situation recognition system and surgical assistance systems. This can pave the way for vendor-independent context awareness in the OR, leading to targeted assistance of the surgical team and an improvement of the surgical workflow.
DMOS transistors in integrated smart power technologies are often subject to cyclic power dissipation with substantial self-heating. This leads to repetitive thermo-mechanical stress, causing fatigue of the on-chip metallization and limiting the lifetime. Hence, most designs use large devices for lower peak temperatures and thus reduced stress to avoid premature failures.
However, significantly smaller DMOS transistors are acceptable if the system reverts to a safer operating condition with lower stress when a failure is expected to occur in the near future. Hence, suitable early-warning sensors are required. This paper proposes a floating metal meander embedded between DMOS source and drain to detect an impending metallization failure. Measurement results of several variants will be presented and discussed, investigating their suitability as early warning indicators.
Documentation of clinical processes, especially in the perioperative area, is a basic requirement for quality of service. Nonetheless, documentation is a burden for the medical staff since it distracts from the clinical core process. An intuitive and user-friendly documentation system could increase documentation quality and reduce documentation workload. The optimal system would know what happened, and the person documenting the step would only need a single "confirm" button. In many cases, such a linear flow of activities is given as long as only one profession (e.g. anaesthesiology, scrub nurse) is considered, but even then there may be deviations from the linear process flow, and further interaction is required.
Multilevel-cell (MLC) flash is commonly deployed in today’s high density NAND memories, but low latency and high reliability requirements make it barely used in automotive embedded flash applications. This paper presents a time domain voltage sensing scheme that applies a dynamic voltage ramp at the cells’ control gate (CG) in order to achieve fast and reliable sensing suitable for automotive applications.
It is known that the costs related to drug research and development (R&D) and the timelines to develop a new drug have increased over the past years. In parallel, the success rates of drug projects along the pharmaceutical R&D phases are still very low, and the outcome of all R&D efforts is stagnating. In consequence, the R&D efficiency, defined as the financial investment per drug, has been steadily decreasing. As innovation is the major growth driver of the pharmaceutical industry, reliable data on R&D efficiency and new concepts to overcome these challenges are of great interest to R&D managers and for the sustainability of the pharmaceutical industry as a whole. This book chapter reviews publications on R&D performance indicators of the past years, such as the success rates and timelines per phase. Additionally, it illustrates the factors that most influence the success rates, timelines, and costs of pharmaceutical R&D and, thus, the denominators of R&D efficiency.
This article is a review of the book "Brain Computation as Hierarchical Abstraction" by Dana H. Ballard, published by MIT Press in 2015. The book appears in the Computational Neuroscience series, which familiarizes the reader with the computational aspects of brain functions based on neuroscientific evidence. It provides an excellent introduction to the functioning of the brain in our daily life, i.e. its structure, networks and routines. The final chapters even discuss behavioral elements such as decision-making, emotions and consciousness. These topics are of high relevance in other sciences such as economics and philosophy. Overall, Ballard's book stimulates a scientifically well-founded debate and, more importantly, reveals the need for an interdisciplinary dialogue towards the social sciences.
This paper is a brief review of the book 'Capital in the Twenty-First Century' by the French scholar Thomas Piketty. The book has started a new debate about inequality and capital taxation in Europe. It provides interesting empirical facts and develops a theory of the functioning of capitalist economies. However, I personally think the book is less convincing than the public debate suggests. The theory of economic growth presented in the book is elusive and lacks a psychological and behavioral underpinning. In fact, I do think that increasing inequality and economic divergence are caused by capitalism, but the psychological and behavioral aspects of humans are of similar or greater significance. Therefore, Piketty's argument does not stimulate an open and scientifically founded debate in all aspects.
This paper is a commentary on the book 'Probability and Stochastic Processes' by Ionut Florescu. The book is an excellent introduction to both probability theory and stochastic processes. It provides a comprehensive discussion of the main statistical concepts, including the theorems and proofs. The introduction to probability theory is easily accessible and a perfect starting point for undergraduate students, even those majoring in subjects other than science, such as business or engineering. The book is also up-to-date because it includes programming code for simulations. However, the book has some weaknesses. It is less convincing in the more advanced topics of stochastic theory, and it does not include solutions to exercises or cover recent research trends.
A lot of people need help in their daily life to wash, select and manage their clothing. The goal of this work is to design an assistant system (eKlarA) to support the user by giving recommendations to choose the clothing combinations, to find the clothing and to wash the clothing. The idea behind eKlarA is to generate a system that uses sensors to identify the clothing and their state in the clothing cycle. The clothing cycle consists of the stations: closets, laundry basket and washing machine in one or several places. The system uses the information about the clothing, weather and calendar to support the user in the different steps of the clothing cycle. The first prototype of this system has been developed and tested. The test results are presented in this work.
A seamless convergence of the digital and physical factory aiming at a personalized Product Emergence Process (PPEP) for smart products within the ESB Logistics Learning Factory at Reutlingen University.
A completely new business model with reference to Industrie 4.0, facilitated by 3D experience software, is one of the mission scenarios in the ESB Logistics Learning Factory at ESB Business School (Reutlingen University), reflecting today's networked society in which customers expect immediate responses, delightful experiences and simple solutions.
The business experience platform provides software solutions for every organizational unit in the company and in the factory. An interface with dashboards, project management apps, 3D design and construction apps with high-end visualization, manufacturing and simulation apps, as well as intelligence and social networking apps in a collaborative, interactive environment helps the user learn to create an end-to-end value process for a personalized, virtual and later physically produced product.
Instead of traditional ways of working in a conventionally operating factory, real workers and robots work semi-intuitively together. The centerpiece of the self-planned interim factory is the smart personalized product, uniquely identifiable and locatable at all times during the production process: a scooter with an individually colored mobile phone holder for any smartphone, produced with a 3D printer in lot size one. Smart products of the future incorporate internet-based services, designed and manufactured at the cost of mass products. Additionally, the scooter is equipped with a retrievable declarative product memory. Monitoring and control are handled by sensor tags and a Raspberry Pi positioned on the product. The engineering design and implementation of a changeable production system is guided by a self-execution system that independently finds, among other things, esplanade workplaces.
The competences imparted to students and professionals include the SCRUM project management method, the customization of workflows according to Industrie 4.0 principles, the enhancement of products with new personalized intelligent parts and self-programmed electrical and electronic components, the control of access to the product memory information, planning in a digital engineering environment, and setting up the physical factory to produce customer orders. The action-oriented experience gained relates to the chances and requirements of holistic digital and physical systems.
In analog layout design, chip floorplans are usually still handcrafted by human experts. In particular, the nondiscrete variability of block dimensions must be exploited thereby, which is a serious challenge for optimization-based algorithmic floorplanners. This paper presents a fundamentally new automation approach based on self-organization, in which floorplan blocks can autonomously move, rotate and deform themselves to jointly let compact results emerge from a synergistic flow of interaction. Our approach is able to minimize area and wirelength, supports nonslicing floorplan structures, can consider fully variable block dimensions, accounts for a fixed rectilinear boundary, and works absolutely deterministically. The approach is innovatively different from conventional, top-down oriented floorplanning algorithms.
Sleep quality and, in general, behavior in bed can be detected using a sleep state analysis. These results can help a subject regulate sleep and recognize different sleeping disorders. In this work, a sensor grid for pressure and movement detection supporting sleep phase analysis is proposed. In comparison to the leading standard measuring system, polysomnography (PSG), the system proposed in this project is a non-invasive sleep monitoring device. For continuous analysis or home use, PSG or wearable actigraphy devices tend to be uncomfortable. Besides this, they are also very expensive. The system presented in this work classifies respiration and body movement with only one type of sensor, and also in a non-invasive way. The sensor used is a pressure sensor, which is low-cost and can be used for commercial purposes. The system was tested in an experiment that recorded the sleep process of a subject. The recordings showed the potential for classification of breathing rate and body movements. Although previous research shows the use of pressure sensors for recognizing posture and breathing, the sensors have mostly been positioned between the mattress and the bedsheet. This project, however, shows an innovative way to position the sensors under the mattress.
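As a minimal illustration of how a breathing rate could be derived from such a pressure signal, the sketch below counts local maxima on a synthetic sine-like signal. The sampling rate, threshold and signal shape are illustrative assumptions, not the project's actual processing pipeline:

```python
import math

def breathing_rate(samples, fs, threshold=0.0):
    """Estimate breaths per minute from a pressure signal by counting
    local maxima above a threshold (a deliberately minimal detector)."""
    peaks = [i for i in range(1, len(samples) - 1)
             if samples[i] > threshold
             and samples[i] > samples[i - 1]
             and samples[i] >= samples[i + 1]]
    duration_min = len(samples) / fs / 60.0
    return len(peaks) / duration_min

# Synthetic 60 s recording at 10 Hz: 0.25 Hz breathing (15 breaths/min).
fs = 10
signal = [math.sin(2 * math.pi * 0.25 * n / fs) for n in range(60 * fs)]
rate = breathing_rate(signal, fs, threshold=0.5)
```

A real under-mattress signal would additionally require filtering and artifact rejection before any such peak counting, since body movements dominate the raw pressure trace.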
This paper presents a measurement setup and an assembly technique suitable for characterization of power semiconductor devices under very high temperature conditions exceeding 500°C. An important application of this is the experimental investigation of wide bandgap semiconductors. Measurement results are shown for a 1200V SiC MOSFET and a 650V depletion mode GaN HEMT.
Background and purpose: Transapical aortic valve replacement (TAVR) is a recent minimally invasive surgical treatment technique for elderly and high-risk patients with severe aortic stenosis. In this paper, a simple and accurate image-based method is introduced to aid the intra-operative guidance of the TAVR procedure under 2-D X-ray fluoroscopy.
Methods: The proposed method fuses a 3-D aortic mesh model and anatomical valve landmarks with live 2-D fluoroscopic images. The 3-D aortic mesh model and landmarks are reconstructed from an interventional X-ray C-arm CT system, and a target area for valve implantation is automatically estimated using these aortic mesh models. Based on a template-based tracking approach, the overlay of the visualized 3-D aortic mesh model, landmarks and target area of implantation is updated onto fluoroscopic images by approximating the aortic root motion from the motion of a pigtail catheter without contrast agent. In addition, a rigid intensity-based registration algorithm is used to continuously track the aortic root motion in the presence of contrast agent. Furthermore, a sensorless tracking of the aortic valve prosthesis is provided to guide the physician in the appropriate placement of the prosthesis into the estimated target area of implantation.
Results: Retrospective experiments were carried out on fifteen patient datasets from the clinical routine of TAVR. The maximum displacement errors were less than 2.0 mm for both the dynamic overlay of aortic mesh models and the image-based tracking of the prosthesis, which is within the clinically accepted range. Moreover, success rates of the proposed method above 91.0% were obtained for all tested patient datasets.
Conclusion: The results showed that the proposed method for computer-aided TAVR is potentially a helpful tool for physicians by automatically defining the accurate placement position of the prosthesis during the surgical procedure.
A simple determination of the error voltage compensation map for motor parameter identification
(2018)
This paper proposes a new method for determining the error voltage compensation map in a parameter identification procedure of three-phase induction motors with an inverter. The compensation curve depending on the motor current is determined using a simple procedure based on given reference voltage steps and the corresponding steady state values of the stator current of the induction motor.
Many scientific reports have warned about the catastrophic consequences of unchecked climate change, with the latest international report calling for emissions of climate pollutants to reach net zero by around 2050 (IPCC, 2018). Limiting warming to 1.5°C could save more than 100 million people from water shortages, as many as 2 billion people from dangerous heatwaves, and the majority of species from climate change extinction risks (IPCC, 2018; Warren et al., 2018). The actions taken to achieve these climate outcomes would generate benefits of more than $20 trillion while easing global economic inequality (Burke et al., 2018). Scientists make it clear that it is physically possible to meet these goals using today’s technologies (Holz et al., 2018). Yet emissions of climate pollutants continue to grow, reaching a new record high in 2018 (Jackson et al., 2018). Clearly, scientific evidence has failed to spark needed climate action. The question now is: what can?
During the first years of the last decade, Egypt used to face recurrent electricity cut-offs in summer. In the past few years, the electricity tariff has increased dramatically. Radiative cooling to the clear night sky is a renewable energy source that represents a potential solution. The dry desert climate promotes nocturnal radiative cooling applications. This study investigates the potential of nocturnal radiative cooling systems (RCSs) to reduce the energy consumption of the residential building sector in Egypt. The system technology proposed in this work is based on uncovered solar thermal collectors integrated into the building hydronic system. By implementing different control strategies, the same system could be used for both cooling and heating applications. The goal of this paper is to analyze the performance of RCSs in residential buildings in Egypt. The dynamic simulation program TRNSYS was used to simulate the thermal behavior of the system. The relevant issues of Egypt as a case study are first overviewed. The paper then introduces the work done to develop a building model that represents a typical residential apartment in Egypt. Typical occupancy profiles were developed to define the internal thermal gains. The control strategy adopted to optimize the system operation is presented as well. To fully understand and hence evaluate the operation of the proposed RCS, four simulation cases were considered: 1. a reference case (fully passive), 2. the stand-alone operation of the RCS, 3. ideal heating & cooling operation (fully active), and 4. the hybrid operation (the active cooling system supported by the proposed RCS). The analysis considered the three main distinct climates in Egypt, represented by the cities of Alexandria, Cairo and Asyut. The hotter and drier weather conditions resulted in a higher cooling potential and larger temperature differences. The simulated cooling power in Asyut was 28.4 W/m² for a 70 m² absorber field.
For a smaller field area of 10 m², the cooling power reached 109 W/m², but with modest temperature differences. To meet rigorous thermal comfort conditions, the proposed sensible RCS cannot fully replace conventional air-conditioning units, especially in humid areas like Alexandria. When working in a hybrid system, a 10% reduction in the active cooling energy demand could be achieved in Asyut while keeping the cooling set-point at 24 °C. This percentage reduction nearly doubled when the thermal comfort set-point was raised by two degrees (to 26 °C). In a sensitivity analysis, external shading devices as a passive measure as well as the implementation of the Egyptian code for buildings (ECP306/1–2005) were also investigated. The analysis of this study raised other relevant aspects for discussion, e.g. system sizing, environmental effects, limitations and recommendations.
Webinars are witnessing a resurgence in popularity, especially since the COVID-19 pandemic, and this is now revealing itself to be an actual shift in how buyers and sellers will do business in the future. Therefore, this study aims to investigate whether webinars will continue to be a valuable tool in B2B sales in the future. Specifically, it aims to gain a deeper understanding of how the role of webinars has evolved in recent years and to analyze their future potential.
Webinars are an essential tool for a wide variety of different use cases. While they have been around for over a decade, webinars have recently seen a resurgence in popularity. As the COVID-19 pandemic strictly limited contact between people and made working from home the only option, hosting and participating in webinars has become more common than ever - whether in business, education, or leisure.
Webinars can be effective for various purposes as they are held in real-time and allow multiple engagement opportunities between attendees and hosts. Moreover, because of their remote nature, webinars are more cost-effective and time-saving in organizing and supervising. Consequently, it is cheaper and more convenient to reach your desired target group as a webinar host than to hold a seminar in physical form. Among other reasons, convenience and interaction seem to be the most potent aspects of cementing webinars as a tool in the digital world. Nevertheless, where exactly are they used, and how do they create value in their respective usage fields?
Using predictive maintenance, more efficient processes can be implemented, leading to lower maintenance costs and increased availability. The development of a predictive maintenance solution currently requires high effort in time and capacity as well as, often, interdisciplinary cooperation. This paper presents a standardized model to describe a predictive maintenance use case. The description model is used to collect, present, and document the information required for the implementation of predictive maintenance use cases by and for different stakeholders. Based on this model, predictive maintenance solutions can be introduced more efficiently. The method is validated across departments in the automotive sector.
While academia and industry see large potential for human-robot collaboration (HRC), only a small number of realized HRC applications are currently found in industry. To gather more data about current hindrances to a wider implementation of collaborative robots, a study among 15 robot manufacturers and 14 system integrators of collaborative robot technology was conducted using a predesigned questionnaire. Additionally, five industrial users of human-robot collaboration were interviewed about the main challenges they experienced during the initial implementation process. The quantitative data was analyzed using the Wilcoxon signed-rank test. According to the study participants, the main challenges in the implementation are currently the identification of HRC-suitable processes, the application of relevant safety norms (such as ISO 10218, ISO/TS 15066) and the application-specific risk assessment.
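For illustration, the Wilcoxon signed-rank statistic underlying such a paired analysis can be computed by hand as below. The paired ratings are invented example values, not the study's data:

```python
def wilcoxon_w(before, after):
    """Compute the Wilcoxon signed-rank statistic W+ (sum of ranks of
    positive differences), using average ranks for tied magnitudes."""
    diffs = [a - b for a, b in zip(after, before) if a != b]
    ranked = sorted(range(len(diffs)), key=lambda i: abs(diffs[i]))
    ranks = [0.0] * len(diffs)
    i = 0
    while i < len(ranked):
        j = i
        # extend j over the group of tied absolute differences
        while (j + 1 < len(ranked)
               and abs(diffs[ranked[j + 1]]) == abs(diffs[ranked[i]])):
            j += 1
        avg = (i + j) / 2 + 1          # average rank for the tie group
        for k in range(i, j + 1):
            ranks[ranked[k]] = avg
        i = j + 1
    return sum(r for d, r in zip(diffs, ranks) if d > 0)

# Hypothetical paired ratings of one challenge item by five participants.
w_plus = wilcoxon_w(before=[3, 5, 2, 6, 1], after=[5, 4, 5, 4, 5])
```

The statistic is then compared against tabulated critical values (or a normal approximation for larger samples) to decide significance.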
This paper examines the determinants of Google search in the banking area. The weekly Google data from 2004 to 2013 used for this study covers the 30 largest banks, the Federal Reserve, and the European Central Bank. To my knowledge, this is the first study on the determinants of Google data. First, the paper shows that Google searches are correlated with several performance variables and market data, such as asset prices and trading volume. Second, it demonstrates that banks' internal performance data has a major influence, whereas market data is rather insignificant. Moreover, it is shown that Google search for central banks is largely determined by the level of interest rates as well as the inflation and output gap. This is evidence that central bank attention is primarily driven by the policy targets. Accordingly, Google data can be applied to analyze the timely impact of monetary policy.
Additive manufacturing is a key technology that applies the ideas of Industry 4.0 in order to enable the economical production of personalized and highly customized products. However, small and medium-sized companies often lack the competence and experience to evaluate the potential of additive manufacturing technologies objectively and thoroughly. Furthermore, the method has been validated in a small medical technology company by evaluating the additive manufacturing potential of an existing surgery tool.
The cloud evolved into an attractive execution environment for parallel applications from the High Performance Computing (HPC) domain. Existing research recognized that parallel applications require architectural refactoring to benefit from cloud-specific properties (most importantly elasticity). However, architectural refactoring comes with many challenges and cannot be applied to all applications due to fundamental performance issues. Thus, during the last years, different cloud migration strategies have been considered for different classes of parallel applications. In this paper, we provide a survey on HPC cloud migration research. We investigate the approaches applied and the parallel applications considered. Based on our findings, we identify and describe three cloud migration strategies.
The development of automatic solutions for the detection of physiological events of interest is booming. Improvements in the collection and storage of large amounts of healthcare data allow access to these data faster and more efficiently. As a result, the development of artificial intelligence models for the detection and monitoring of a large number of pathologies is becoming increasingly common in the medical field. In particular, developing deep learning models for detecting obstructive sleep apnea (OSA) events is at the forefront. Numerous scientific studies focus on the architecture of the models and the results these models can provide in terms of OSA classification and Apnea-Hypopnea Index (AHI) calculation. However, little focus is put on other aspects of great relevance that are crucial for the training and performance of the models, among them the set of physiological signals used and the preprocessing tasks prior to model training. This paper covers the essential requirements that must be considered before training a deep learning model for obstructive sleep apnea detection, and reviews the preprocessing solutions that currently exist in the scientific literature.
The acquisition of data for reality mining applications is a critical factor, since many mobile devices, e.g. smartphones, must be capable of capturing the required data. Otherwise, only a small target group would be able to use the reality mining application. In the course of a survey, we have identified smartphone features which might be relevant for various reality mining applications. The survey classifies these features and shows how the support of each feature has changed over the years by analyzing 143 smartphones released between 2004 and 2015. All analyzed devices can be ranked by their number of provided features. Furthermore, this paper deals with quality issues that occurred while carrying out the survey.
Many organizations have identified the opportunities of big data analytics to support the business with problem-specific insights through the exploitation of generated data. Socio-technical solutions are developed in big data projects to reach competitive advantage. Although these projects are aligned to specific business needs, common architectural challenges are not addressed in a comprehensive manner. Enterprise architecture (EA) management is a holistic approach to tackle the complex business and IT architecture. The transformation of an organization's EA is influenced by big data projects and their data-driven approach on all layers. To enable strategy-oriented development of the EA, it is essential to synchronize these projects supported by EA management. In this paper, we conduct a systematic review of big data literature to analyze which requirements for the EA management discipline are proposed. Thereby, a broad overview of existing research is presented to facilitate a more detailed exploration and to foster the evolution of the EA management discipline.
Blockchain is a technology for the secure processing and verification of data transactions based on a distributed peer-to-peer network that uses cryptographic processes, consensus algorithms, and backward-linked blocks to make transactions virtually immutable. Within supply chain management, blockchain technology offers potential for increasing supply chain transparency, visibility, automation, and efficiency. However, its complexity requires future employees to have comprehensive knowledge of the functionality of blockchain-based applications in order to be able to apply their benefits to scenarios in supply chains and production. Learning factories represent a suitable environment allowing learners to experience new technologies and to apply them to virtual and physical processes throughout value chains. This paper presents a concept to practically transfer knowledge about the technical functionality of blockchain technology to future engineers and software developers working within supply chains and production operations, sensitizing them to the advantages of decentralized applications. First, the concept proposes methods to playfully convey immutable backward-linked blocks and the embedding of blockchain smart contracts. Subsequently, the students use this knowledge to develop blockchain-based application scenarios by means of an exemplary product in a learning factory environment. Finally, the developed solutions are implemented with the help of a prototypical decentralized application, which enables a holistic mapping of supply chain events.
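The notion of immutable, backward-linked blocks that the concept conveys can be illustrated with a minimal hash chain. This is a teaching sketch with invented supply-chain events, not the prototype application described in the paper:

```python
import hashlib
import json

def block_hash(block):
    """Hash the block's canonical JSON form (key-order stable) with SHA-256."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append_block(chain, event):
    """Link a new supply-chain event to the previous block's hash."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"index": len(chain), "event": event, "prev_hash": prev})

def chain_valid(chain):
    """A chain is valid iff every block cites its predecessor's hash."""
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

chain = []
for event in ["goods received", "quality checked", "shipped"]:
    append_block(chain, event)

ok_before = chain_valid(chain)
chain[1]["event"] = "quality skipped"   # tampering breaks the backward link
ok_after = chain_valid(chain)
```

Because each block embeds the hash of its predecessor, altering any past event invalidates every later link, which is exactly the immutability property the learning module conveys.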
Introduction: Telemedicine reduces greenhouse gas emissions (CO2eq); however, the results of studies vary widely depending on the setting. This is the first study to focus on the effects of telemedicine on the CO2 footprint of primary care.
Methods: We conducted a comprehensive retrospective study to analyze total CO2eq emissions of kilometers (km) saved by telemedical consultations. We categorized prevented and provoked patient journeys, including pharmacy visits. We calculated CO2eq emission savings through primary care telemedical consultations in comparison to those that would have occurred without telemedicine. We used the comprehensive footprint approach, including all telemedical cases and the CO2eq emissions by the telemedicine center infrastructure. In order to determine the net ratio of CO2eq emissions avoided by the telemedical center, we calculated the emissions associated with the provision of telemedical consultations (including also the total consumption of physicians’ workstations) and subtracted them from the total of avoided CO2eq emissions. Furthermore, we also considered patient cases in our calculation that needed to have an in-person visit after the telemedical consultation. We calculated the savings taking into account the source of the consumed energy (renewable or not).
Results: 433 890 telemedical consultations overall helped save 1 800 391 km in travel. On average, 1 telemedical consultation saved 4.15 km of individual transport and consumed 0.15 kWh. We detected savings in almost every cluster of patients. After subtracting the CO2eq emissions caused by the telemedical center, the data reveal savings of 247.1 net tons of CO2eq emissions in total and of 0.57 kg CO2eq per telemedical consultation. The comprehensive footprint approach thus indicated a reduced footprint due to telemedicine in primary care.
Discussion: Integrating a telemedical center into the health care system reduces the CO2 footprint of primary care medicine; this is true even in a densely populated country with little car use, such as Switzerland. The insights of this study complement previous studies that focused on narrower aspects of telemedical consultations.
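The reported per-consultation figures can be reproduced from the totals in the abstract with a quick back-of-envelope calculation:

```python
# All inputs are the totals reported in the abstract above.
consultations = 433_890
km_saved = 1_800_391
net_savings_kg = 247.1 * 1000               # 247.1 net tons of CO2eq

km_per_consultation = km_saved / consultations              # ~4.15 km
kg_co2eq_per_consultation = net_savings_kg / consultations  # ~0.57 kg
```

Both per-consultation values match the figures stated in the results, confirming that they are simple averages over all telemedical consultations.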
For many companies, it is major international sporting events (in particular the Football World Cup or the Olympic Games) that constitute the ideal platform for the integration of their target group-specific marketing communication into an attractive sports environment. Sports event organizers sell exclusive marketing rights for their events to official sponsors, who, in return, acquire exclusive options to utilize the event for their own advertising purposes. Ambush marketing is the method used by companies that do not hold marketing rights to an event but still use their marketing activities in diverse ways to establish a connection to it. There is still widespread debate and confusion about the topic. Ambush marketing is often defined in different ways, by different people, according to their position as either supporters or opponents of the practice.
This white paper builds a new financial theory of euro area sovereign bond markets under stress. The theory explains the abnormal bond pricing and increasing spreads during the recent market turmoil. We find that the strong disconnect of bond spreads from the respective bonds’ underlying fundamental values in 2010 was triggered by an increase in asymmetric information and weak reputation of government policies. Both factors cause a normal bond market to switch into a crisis mode. Finally, those markets are prone to self-fulfilling bubbles in which the economic effects are amplified by herding behaviour arising from animal spirits. Altogether, this produces contagious effects and multiple equilibria. Thus, we argue that government bond markets in a monetary union are more fragile and vulnerable to liquidity and solvency crises. Consequently, the systemic mispricing of sovereign debt creates more macroeconomic instability and bubbles in the euro area than in a single country. In other words, financial markets are partly blind to national default risks in a currency union. Therefore, the current European institutional framework puts the wrong incentives in place and needs structural changes soon. To tackle the root causes we suggest more market incentives via consistent rules, pre-emptive austerity measures in good economic times, and a resolution scheme for heavily indebted countries. In summary, our paper enhances the bond market theory and provides new insights into the recent bond market turmoil in Europe.
This project aims to evaluate existing big data infrastructures for their applicability in the operating room to support medical staff with context-sensitive systems. Requirements for the system design were generated. The project compares different data mining technologies, interfaces, and software system infrastructures with a focus on their usefulness in the peri-operative setting. The lambda architecture was chosen for the proposed system design, which will provide data for both postoperative analysis and real-time support during surgery.
Context: Many companies are facing an increasingly dynamic and uncertain market environment, in which traditional product roadmapping practices are no longer sufficiently applicable. As a result, many companies need to adapt their product roadmapping practices to continue operating successfully in today’s dynamic market environment. However, transforming product roadmapping practices is a difficult process for organizations, and existing literature offers little help on how to accomplish such a process.
Objective: The objective of this paper is to present a product roadmap transformation approach for organizations to help them identify appropriate improvement actions for their roadmapping practices using an analysis of their current practices.
Method: Based on an existing assessment procedure for evaluating product roadmapping practices, the first version of a product roadmap transformation approach was developed in workshops with company experts. The approach was then given to eleven practitioners and their perceptions of the approach were gathered through interviews.
Results: The result of the study is a transformation approach consisting of a process describing what steps are necessary to adapt the currently applied product roadmapping practice to a dynamic and uncertain market environment. It also includes recommendations on how to select areas for improvement and two empirically based mapping tables. The interviews with the practitioners revealed that the product roadmap transformation approach was perceived as comprehensible, useful, and applicable. Nevertheless, we identified potential for improvements, such as a clearer presentation of some processes and the need for more improvement options in the mapping tables. In addition, minor usability issues were identified.
A vapor permeation process for the separation of aromatic compounds from aliphatic compounds
(2014)
A number of rubbery and glassy membranes have been prepared and evaluated in vapor permeation experiments for separation of aromatic/aliphatic mixtures, using 5/95 (wt:wt) toluene/methylcyclohexane (MCH) as a model solution. Candidate membranes that met the required toluene/MCH selectivity of ≥ 10 were identified. The stability of the candidate membranes was tested by cycling the experiment between higher toluene concentrations and the original 5 wt% level. The best membrane produced has a toluene permeance of 280 gpu and a toluene/MCH selectivity of 13 when tested with a vapor feed of the model mixture at its boiling point and at atmospheric pressure. When a series of related membrane materials are compared, there is a sharp trade-off between membrane permeance and membrane selectivity. A process design study based on the experimental results was conducted. The best preliminary membrane design uses 45% of the energy of a conventional distillation process.
Size and function of bioartificial tissue models are still limited due to the lack of blood vessels and dynamic perfusion for nutrient supply. In this study, we evaluated the use of cytocompatible methacryl-modified gelatin for the fabrication of a hydrogel-based tube by dip-coating and subsequent photo-initiated cross-linking. The wall thickness of the tubes and the diameter were tuned by the degree of gelatin methacryl-modification and the number of dipping cycles. The dipping temperature of the gelatin solution was adjusted to achieve low-viscosity fluids of approximately 0.1 Pa s and differed for gelatin derivatives with different modification degrees. A versatile perfusion bioreactor for the supply of surrounding tissue models was developed, which can be adapted to several geometries and sizes of blood-vessel-mimicking tubes. The manufactured bendable gelatin tubes were permeable for water and dissolved substances, like Nile Blue and serum albumin. As a proof of concept, human fibroblasts in a three-dimensional collagen tissue model were successfully supplied with nutrients via the central gelatin tube under dynamic conditions for 2 days. Moreover, the tubes could be used as scaffolds to build up a functional and viable endothelial layer. Hence, the presented tools can contribute to solving current challenges in tissue engineering.
This paper compares the influence a video self-avatar and a lack of a visual representation of a body have on height estimation when standing at a virtual visual cliff. A height estimation experiment was conducted using a custom augmented reality Oculus Rift hardware and software prototype also described in this paper. The results show a consistency with previous research demonstrating that the presence of a visual body influences height estimates, just as it has been shown to influence distance estimates and affordance estimates.
Representing users within an immersive virtual environment is an essential functionality of a multi-person virtual reality system. Especially when communicative or collaborative tasks must be performed, challenges arise in realistically embodying and integrating such avatar representations. A shared comprehension of local space and non-verbal communication (like gesture, posture or self-expressive cues) can support these tasks. In this paper, we introduce a novel approach to create realistic, video-texture-based avatars of colocated users in real-time and integrate them in an immersive virtual environment. We show a straightforward and low-cost hardware and software solution to do so. We discuss technical design problems that arose during implementation and present a qualitative analysis of the usability of the concept from a user study, applying it to a training scenario in the automotive sector.
The demonstration project Virtual Power Plant Neckar-Alb is constructing a Virtual Power Plant (VPP) demonstration site at the Reutlingen University campus. The VPP demonstrator integrates a heterogeneous set of distributed energy resources (DERs), which are connected to a control infrastructure and an energy management system. This paper describes the components and the architecture of the demonstrator and presents strategies for demonstrating multiple optimization and control systems with different control paradigms.
In this work, a web-based software architecture and framework for management and diagnosis of large amounts of medical data in an ophthalmologic reading center is proposed. Data management for multi-center studies requires merging of standing data and repeatedly gathered clinical evidence such as vital signs and raw data. If ophthalmologic questions are involved, the data acquisition is often performed by non-medical staff at the point of care or a study center, whereas the medical findings are mostly provided by an ophthalmologist in a specialized reading center. The study data, such as participants, cohorts and measured values, are administered at a single data center for the entire study. Since a specialized reading center maintains several studies, the medical staff must learn the different data administration procedures of the different data centers. With respect to the increasing number and size of clinical studies, two aspects must be considered. First, an efficient software framework is required to support data management, processing and diagnosis by medical experts at the reading center. Second, this software needs a standardized user interface that does not have to be trained/tailored/adapted for each new study. Furthermore, different aspects of quality and security control have to be included. Therefore, the objective of this work is to establish a multi-purpose ophthalmologic reading center, which can be connected to different data centers via configurable data interfaces in order to treat various topics simultaneously.
In this paper we describe an interactive web-based tool for visual analysis of Formula 1 data. A calendar-like representation provides an overview of all races on a yearly basis, either in absolute or normalized time. After selecting a dedicated race, more details about this race can be explored. Furthermore, it is possible to compare up to three different races. Besides visualizing details on dedicated races, it is also possible to analyse driver and team performance over time. A user study was conducted to gather feedback on the usage of the application and to decide between different visualization options.
Workflow-driven support systems in the peri-operative area have the potential to optimize clinical processes and to enable new situation-adaptive support systems. We started to develop a workflow management system supporting all involved actors in the operating theatre, with the goal of synchronizing the tasks of the different stakeholders by giving relevant information to the right team members. Using the OMG standards BPMN, CMMN and DMN gives us the opportunity to bring established methods from other industries into the medical field. The system shows each addressed actor their information in the right place at the right time, so that every member can execute their task in time and a smooth workflow is ensured; the system maintains the overall view of all tasks. Accordingly, the workflow management system comprises the Camunda BPM workflow engine to run the models, a middleware to connect different systems to the workflow engine, and graphical user interfaces to show necessary information or to interact with the system. The complete pipeline is implemented as a RESTful web service, so different systems such as the hospital information system (HIS) can be integrated easily and without loss of data. The first prototype is implemented and will be expanded.
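As an illustrative sketch of how an external system might start such a workflow, the snippet below builds and sends a start request to the Camunda 7 REST API (`POST /process-definition/key/{key}/start`). The base URL, process key, and variable names are hypothetical and depend on the actual deployment:

```python
import json
import urllib.request

# Hypothetical engine location -- adjust to your deployment.
CAMUNDA_BASE = "http://localhost:8080/engine-rest"

def build_start_request(process_key, variables):
    """Build URL and JSON body for starting a process instance via the
    Camunda 7 REST API (POST /process-definition/key/{key}/start)."""
    url = f"{CAMUNDA_BASE}/process-definition/key/{process_key}/start"
    body = {
        "variables": {
            name: {"value": value, "type": "String"}
            for name, value in variables.items()
        }
    }
    return url, json.dumps(body).encode("utf-8")

def start_process(process_key, variables):
    """Send the start request and return the new instance description."""
    url, data = build_start_request(process_key, variables)
    req = urllib.request.Request(
        url, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

A caller might then invoke something like `start_process("perioperative-workflow", {"patientId": "P-0042"})`, where both names are invented for illustration.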
Information systems that support the workflow in the clinical area are currently limited to organizational processes. This work shows a first approach to an information system supporting all actors in the perioperative area. The first prototype and proof of concept was a task manager giving all actors information about their tasks and the tasks of all other actors during an intervention. Based on this initial task manager, we implemented an information system built on a workflow engine controlling all processes and all information necessary for the intervention. A second part was the development of a perioperative process visualization, which was developed in a user-centered approach jointly with clinicians and OR staff.
Three established test methods for evaluating the abrasion or wear resistance of textile materials were compared to gain deeper insight into the specific damaging mechanisms and to better understand the comparability of the results of the different tests. Knowledge of these mechanisms is necessary for the systematic development of finishing agents that improve the wear resistance of textiles. Martindale, Schopper, and Einlehner tests were used to analyze two different fabrics made of natural (cotton) or synthetic (polyethylene terephthalate) fibers, respectively. Samples were investigated by digital microscopy and scanning electron microscopy to visualize the damage. Damage symptoms are compared and discussed with respect to differences in the damaging mechanisms.
Cleanable bag filters are used for the separation of dusts and dust-like substances. Owing to typical process conditions, they are subject to thermal, chemical, and mechanical stress during operation. IGF project no. 18307, "Investigation of the chemical and thermal degradation of cleanable filter media and improvement of their resistance through surface modification", compared several test methods.
The persistently high levels of debt in some member states of the European Economic and Monetary Union continue to raise fears of sovereign insolvencies. In order to cope with the problems that have arisen, but also to prevent such a situation from occurring in the first place, the author considers a sovereign insolvency regime necessary, with bail-outs by the other member states only in emergencies. He proposes a resolution mechanism for over-indebted euro countries based on a 2016 concept by the German Council of Economic Experts (Sachverständigenrat).
Sleep is an essential part of human existence, as we are in this state for approximately a third of our lives. Sleep disorders are common conditions that can affect many aspects of life. Sleep disorders are diagnosed in special laboratories with a polysomnography system, a costly procedure requiring much effort from the patient. Several systems have been proposed to address this situation, including performing the examination and analysis at the patient's home, using sensors to detect physiological signals that are automatically analysed by algorithms. This work aims to evaluate the use of a contactless respiratory recording system based on an accelerometer sensor for sleep apnea detection. For this purpose, an installation mounted under the bed mattress records the oscillations caused by chest movements during breathing. The presented processing algorithm filters the obtained signals and detects the presence of apnea events. The performance of the developed system and apnea detection algorithm (average values of accuracy, specificity and sensitivity are 94.6%, 95.3%, and 93.7%, respectively) confirms the suitability of the proposed method and system for further ambulatory and in-home use.
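The abstract does not detail the processing algorithm; as a hedged illustration of the general idea, the sketch below flags low-amplitude stretches of a respiratory signal using a moving-RMS envelope and a relative threshold. The window length and threshold ratio are invented values, not the authors' parameters:

```python
import numpy as np

def detect_apnea_windows(signal, fs, win_s=10.0, thresh_ratio=0.2):
    """Flag samples whose breathing-effort amplitude (moving RMS over
    win_s seconds) drops below thresh_ratio of the recording's median
    amplitude -- a crude stand-in for an apnea detector."""
    win = max(1, int(win_s * fs))
    kernel = np.ones(win) / win
    # moving RMS as the amplitude envelope of the respiratory effort
    rms = np.sqrt(np.convolve(np.asarray(signal, float) ** 2, kernel, mode="same"))
    return rms < thresh_ratio * np.median(rms)
```

On a synthetic 0.25 Hz "breathing" sine with a 20 s silent gap, the detector flags the gap and leaves normal breathing unflagged; a real system would add band-pass filtering and minimum-duration rules on top of such an envelope.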
The influence of trust on the adherence to investment recommendations in the context of robo-advisors is under-researched. This relationship needs to be better understood because robo-advice lacks a critical element of trust: human interaction. Theory suggests that ability, integrity, and benevolence are key factors in building trust in human advisors. Using an experimental study design, our research examines the relationship between a robo-advisor's trust attributes and the acceptance of its investment advice. The results show that trust in a robo-advisor increases the propensity to follow its recommendations. While ability and integrity are significant, benevolence is not. The study contributes to the research on technology acceptance, trust, and the adoption of technology-based recommendations by improving the understanding of the relationship between trust and the acceptance of automated investment recommendations.
Lean management has found its way into many companies. Lean concepts place new demands on the type and structure of the required cost information, which traditional cost accounting systems do not directly meet. Proponents of "lean accounting" therefore propose partly radical changes and a simplification of cost accounting. This article discusses the limitations of traditional cost accounting in implementing lean management and presents selected "accounting for lean" approaches. The analysis shows that lean accounting approaches are too narrowly focused and cannot adequately reflect the plurality of cost accounting functions found in practice. A radical redesign of existing cost accounting systems is therefore rejected as unrealistic and unfounded. The article develops alternative proposals for how lean management concepts and the cost information they require can be integrated into traditional cost accounting systems.
Glioblastoma WHO IV belongs to a group of brain tumors that are still incurable. A promising treatment approach applies photodynamic therapy (PDT) with hypericin as a photosensitizer. To generate a comprehensive understanding of the photosensitizer-tumor interactions, the first part of our study is focused on investigating the distribution and penetration behavior of hypericin in glioma cell spheroids by fluorescence microscopy. In the second part, fluorescence lifetime imaging microscopy (FLIM) was used to correlate fluorescence lifetime (FLT) changes of hypericin with environmental effects inside the spheroids. In this context, 3D tumor spheroids are an excellent model system since they capture 3D cell–cell interactions and an extracellular matrix similar to tumors in vivo. Our analytical approach considers hypericin as probe molecule for FLIM and as photosensitizer for PDT at the same time, making it possible to directly draw conclusions about the state and location of the drug in a biological system. The knowledge of both state and location of hypericin enables a fundamental understanding of the impact of hypericin PDT in brain tumors. Following different incubation conditions, the hypericin distribution in peripheral and central cryosections of the spheroids was analyzed. Both fluorescence microscopy and FLIM revealed a hypericin gradient towards the spheroid core for short incubation periods or small concentrations. On the other hand, a homogeneous hypericin distribution is observed for long incubation times and high concentrations. Especially, the observed FLT change is crucial for the PDT efficiency, since the triplet yield, and hence the O2 activation, is directly proportional to the FLT. Based on the FLT increase inside spheroids, an incubation time of 30 min is required to achieve the most suitable conditions for an effective PDT.
Estimating molar solubility from the Hildebrand-Scott relation employing Hansen solubility parameters (HSP) is widely presumed a valid semi-quantitative approach. To test this presumption and to determine quantitatively the inherent accuracy of such a solubility prognosis, l-ascorbic acid (LAA) was treated as an example of a commercially important solute. Analytical calculus and Monte Carlo (MC) simulation were performed for 20 common solvents with total HSP ranging from 14.5 to 33.0 (MPa)0.5, utilizing validated material data. It was found that, due to the uncertainty of the material data used in the calculations, the solubility prediction scattered widely and thus had low precision.
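As a rough illustration of how uncertainty in the underlying material data scatters an HSP-based prediction, the sketch below computes the standard Hansen distance Ra between a solute and a solvent and its Monte Carlo spread under an assumed Gaussian parameter uncertainty. The parameter triples and the sigma value are illustrative, not the validated LAA data used in the paper:

```python
import numpy as np

def hansen_distance(solute, solvent):
    """Standard Hansen distance Ra between two (deltaD, deltaP, deltaH)
    parameter triples, in MPa^0.5: Ra^2 = 4*dD^2 + dP^2 + dH^2."""
    dD, dP, dH = (np.asarray(solute, float) - np.asarray(solvent, float)).T
    return np.sqrt(4 * dD**2 + dP**2 + dH**2)

def mc_distance(solute, solvent, sigma=0.5, n=10_000, seed=1):
    """Monte Carlo scatter of Ra when each HSP component carries an
    independent Gaussian uncertainty of 'sigma' MPa^0.5 (assumed value)."""
    rng = np.random.default_rng(seed)
    a = rng.normal(solute, sigma, size=(n, 3))
    b = rng.normal(solvent, sigma, size=(n, 3))
    return hansen_distance(a, b)
```

Even a modest per-parameter uncertainty produces a visible spread in Ra, which qualitatively mirrors the paper's finding that material-data uncertainty limits the precision of the solubility prognosis.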
Reliable and accurate car driver head pose estimation is an important function for the next generation of advanced driver assistance systems that need to consider the driver state in their analysis. For optimal performance, head pose estimation needs to be non-invasive, calibration-free and accurate for varying driving and illumination conditions. In this pilot study we investigate a 3D head pose estimation system that automatically fits a statistical 3D face model to measurements of a driver’s face, acquired with a low-cost depth sensor on challenging real-world data. We compare the results of our sensor-independent, driver-adaptive approach with those of a state-of-the-art camera-based 2D face tracking system as well as a non-adaptive 3D model, relative to our own ground-truth data, and compare with other 3D benchmarks. We find large accuracy benefits of the adaptive 3D approach.
The pH value of the human skin is not in the neutral range but is slightly acidic, with values of 3.5 to 6 depending on the body part. This provides a suitable habitat for the commensal skin flora but has a killing effect on some pathogenic micro-organisms and an inactivating effect on some viruses. This protective acid mantle of the skin thus represents a first external protective layer against infestation by pathogens. An appropriate surface pH on textiles can help to minimize the transmission of pathogens through the clothing of healthcare workers while at the same time not exerting a negative influence on the skin’s own flora. In addition, the colonization of e.g. bed linen by pathogenic microorganisms can be reduced. This can also have a positive influence on bacteria-associated odor formation on functional clothing.
We investigated the excitation modes of the light-harvesting protein phycocyanin (PC) from Thermosynechococcus vulcanus in the crystalline state using UV and near-infrared Raman spectroscopy. The spectra revealed the absence of a hydrogen out-of-plane wagging (HOOP) mode in the PC trimer, which suggests that the HOOP mode is activated in the intact PC rod, while it is not active in the PC trimer. Furthermore, in the PC trimer an intense mode at 984 cm−1 is assigned to the C–C stretching vibration, while the mode at 454 cm−1 is likely due to ethyl group torsion. In contrast, in the similar chromophore phytochromobilin the C5,10,15-D wag mode at 622 cm−1 does not come from a downshift of the HOOP. Additionally, the absence of modes between 1200 and 1300 cm−1 rules out functional monomerization. A correlation between phycocyanobilin (PCB) and phycoerythrobilin (PEB) suggests that the PCB cofactors of the PC trimer appear in a conformation similar to that of PEB. The conformation of the PC rod is consistent with that of the allophycocyanin (APC) trimer, and thus excitonic flow is facilitated between these two independent light-harvesting compounds. This excitonic flow from the PC rod to APC appears to be modulated by the vibration channels during HOOP wagging, C=C stretching, and the N–H rocking in-plane vibration.
This paper illustrates the implementation of series-connected hardware modules as part of a scalable and modular power electronics device, which is well suited for the field of electric vehicles using wide bandgap semiconductor devices. The main benefit of the modular concept is that different current or voltage requirements can be satisfied through the appropriate series or parallel connection of single modules. The particular design is based on the fact that the single modules generate a continuous and specified output voltage from a given dc voltage. The current work focuses on a brief classification of this work among different series-connected power converter concepts and in particular on an active damping approach for the series-connected LC output filters based on inductor current feedback.
This paper presents a modular and scalable power electronics concept for motor control with continuous output voltage. In contrast to multilevel concepts, modules with continuous output voltage are connected in series. The continuous output voltage of each module is obtained by using gallium nitride (GaN) high-electron-mobility transistors (HEMTs) as switches inside the modules, with a switching frequency in the range between 500 kHz and 1 MHz. Due to this high switching frequency, an LC filter is integrated into the module, resulting in a continuous output voltage. A main topic of the paper is the active damping of this LC output filter for each module and the analysis of the damping behaviour of the series connection. The results are illustrated with simulations and measurements.
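A minimal sketch of the active-damping idea, assuming a single module's second-order LC filter and treating the inductor-current feedback gain kd as a virtual series resistance; the component values are illustrative, not the paper's:

```python
import numpy as np

def simulate_lc_step(L, C, kd, vref=1.0, dt=1e-7, steps=20000):
    """Semi-implicit Euler simulation of one module's LC output filter
    driven by a step to vref; the commanded voltage is reduced by
    kd * inductor current, emulating a series damping resistor."""
    i, vc = 0.0, 0.0
    vc_hist = np.empty(steps)
    for n in range(steps):
        v_cmd = vref - kd * i        # inductor-current feedback
        i += dt / L * (v_cmd - vc)   # L di/dt = v_cmd - vc
        vc += dt / C * i             # C dvc/dt = i (no load)
        vc_hist[n] = vc
    return vc_hist

# Illustrative values only; kd = sqrt(L/C) gives a damping ratio of 0.5.
L_f, C_f = 10e-6, 10e-6
undamped = simulate_lc_step(L_f, C_f, kd=0.0)
damped = simulate_lc_step(L_f, C_f, kd=np.sqrt(L_f / C_f))
```

Without feedback, the step response of the lossless LC filter rings up to roughly twice the commanded voltage; with the current feedback the overshoot drops sharply, which is the effect the active damping aims at, without the losses of a physical resistor.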
This contribution presents a three-phase power stage for motor control with continuous output voltages using wide bandgap semiconductors and an asynchronous delta-sigma based switching signal generation. The focus of the paper is on an active damping approach for the LC output filter based on inductor current feedback.
The spreading area of cells has been shown to play a central role in the determination of cell fate and tissue morphogenesis; however, a clear understanding of how spread cell area is determined is still lacking. The observation that cell area and force generally increase with substrate rigidity suggests that cell area is dictated mechanically, by means of a force-balance between the cell and the substrate. A simple mechanical model, corroborated by experimental measurements of cell area and force, is presented to analyze the temporal force balance between the cell and the substrate during spreading. The cell is modeled as a thin elastic disc that is actively pulled by lamellipodia protrusions at the cell front. The essential molecular mechanisms of the motor activity at the cell front, including actin polymerization, adhesion kinetics, and the actin retrograde flow, are accounted for and used to predict the dynamics of cell spreading on elastic substrates; simple, closed-form expressions for the evolution of cell size and force are derived. Time-resolved traction force microscopy, combined with measurements of cell area, is performed to investigate the simultaneous variations of cell size and force. We find that cell area and force increase simultaneously during spreading but the force develops with an apparent delay relative to the increase in cell area. We demonstrate that this may reflect the strain-stiffening property of the cytoskeleton. We further demonstrate that the radial cell force is a concave function of spreading speed and that this may reflect the strengthening of cell–substrate adhesions during spreading.
Active storage
(2018)
In brief, Active Storage refers to an architectural hardware and software paradigm based on the collocation of storage and compute units. Ideally, it allows executing application-defined data ... within the physical data storage. Thus Active Storage seeks to minimize expensive data movement, improving performance, scalability, and resource efficiency. The effective use of Active Storage mandates new architectures, algorithms, interfaces, and development toolchains.
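A toy sketch of the paradigm: executing an application-defined predicate inside the storage unit instead of shipping every record to the host. The record count and predicate are invented for illustration:

```python
# Toy contrast between shipping raw data to the host and executing an
# application-defined filter inside the storage unit (predicate pushdown).

def host_side(records, predicate):
    """Conventional path: every record crosses the storage-host link."""
    moved = len(records)
    return [r for r in records if predicate(r)], moved

def active_storage(records, predicate):
    """Active-storage path: the predicate runs where the data lives and
    only matching records cross the link."""
    hits = [r for r in records if predicate(r)]
    return hits, len(hits)

data = list(range(1_000_000))
wanted = lambda r: r % 1000 == 0

res_a, moved_a = host_side(data, wanted)
res_b, moved_b = active_storage(data, wanted)
```

Both paths return the same result, but the active-storage path moves only the matching records across the link, which is the data-movement saving the paradigm targets.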
The investigation of stress requires distinguishing between stress caused by physical activity and stress caused by psychosocial factors. The heart's response to stress and to physical activity is very similar when the set of monitored parameters is reduced to one. Currently, the differentiation remains difficult, and methods that use only the heart rate are not able to differentiate between stress and physical activity without additional sensor data. The approach focusses on methods that generate signals with characteristics useful for detecting stress, physical activity, inactivity, and relaxation.
The following paper addresses the question of which current consumer lifestyle segmentation methods exist for particular European countries and for Europe as a whole. This is important for corporations to be able to position their products accurately through consumer-oriented marketing, given the constant change of values and attitudes. Drawing on current literature, internet sources and documents, the state of the science is presented through a detailed description of the most popular lifestyle segmentation methods used in European countries. In addition, these instruments are discussed individually and then compared with each other. All instruments, the Sinus-Milieus, Euro-Socio-Styles, Roper-Consumer-Styles, RISC and Mosaic, serve the same purpose, even though they differ considerably from each other. Each market research company has its own method of generating its model, as well as different segments and definitions for them. Furthermore, every segmentation method is illustrated in a different way. This paper presents all these instruments in detail and shows their advantages and disadvantages. Summing up the literature research on the main research question, there are several models segmenting consumers into different lifestyle groups, e.g. in Germany, France or Great Britain, but still few models referring to the entire European market.
This work presents an approach to supporting workers, foremen, and maintenance staff that enables them to access currently required information and the interrelationships within high-variant series production directly from the situation at hand (ad hoc). The focus is on the company-neutral overall concept of a production-level context information system, consisting of a production environment model and a system architecture. The production environment model describes and links the information and interrelationships of high-variant series production. The main ordering criteria are membership in a specific group (type), the identity of an object, and its location and operating state over time. The system architecture is modular. The modules are divided into acquisition modules, context management modules, function modules for automatic and manual information filtering, and presentation modules, and they communicate via a uniform interface.
The fast-moving process of digitization demands flexibility in order to adapt to rapidly changing business requirements and newly emerging business opportunities. New features have to be developed and deployed to the production environment a lot faster. To be able to cope with this increased velocity and pressure, many software developing companies have switched to a Microservice Architecture (MSA) approach. Applications built this way consist of several fine-grained and heterogeneous services that are independently scalable and deployable. However, the technological and business architectural impacts of microservice-based applications directly affect their integration into the digital enterprise architecture. As a consequence, traditional Enterprise Architecture Management (EAM) approaches are not able to handle the extreme distribution, diversity, and volatility of micro-granular systems and services. We are therefore researching mechanisms for dynamically integrating large amounts of microservices into an adaptable digital enterprise architecture.
SmartLife ecosystems are emerging as intelligent user-centered systems that will shape future trends in technology and communication. Biological metaphors of living adaptable ecosystems provide the logical foundation for self-optimizing and self-healing run-time environments for intelligent adaptable business services and related information systems with service-oriented enterprise architectures. The present research-in-progress work investigates mechanisms for adaptable enterprise architectures for the development of service-oriented ecosystems with integrated technologies like Semantic Technologies, Web Services, Cloud Computing and Big Data Management. With a large and diverse set of ecosystem services with different owners, our scenario of service-based SmartLife ecosystems can pose challenges in their development, and more importantly, for maintenance and software evolution. Our research explores the use of knowledge modeling using ontologies and flexible metamodels for adaptable enterprise architectures to support program comprehension for software engineers during maintenance and evolution tasks of service-based applications. Our previous reference enterprise architecture model ESARC -- Enterprise Services Architecture Reference Cube -- and the Open Group SOA Ontology were extended to support agile semantic analysis, program comprehension and software evolution for a SmartLife application scenario. The Semantic Browser is a semantic search tool that was developed to provide knowledge-enhanced investigation capabilities for service-oriented applications and their architectures.
Adaptation of the business model canvas template to develop business models for the circular economy
(2021)
The Business Model Canvas, as a template for strategic management, supports the development of new and the documentation of existing linear business models. However, the change towards a Circular Economy requires new value creation structures and thus changed business models. To develop business models for circular economies, it is necessary to adapt the existing template, since the actors involved along the value chain take on changed roles. In this paper, a template based on the existing Business Model Canvas is presented that allows business models for a Circular Economy to be developed and documented.
The basic idea behind a wearable robotic grasp assistance system is to support people who suffer from severe motor impairments in daily activities. Such a system needs to act mostly autonomously and according to the user’s intent. Vision-based hand pose estimation could be an integral part of a larger control and assistance framework. In this paper we evaluate the performance of egocentric monocular hand pose estimation for a robot-controlled hand exoskeleton in a simulation. For hand pose estimation we adopt a Convolutional Neural Network (CNN). We train and evaluate this network on computer-graphics data created by our own data generator. To guide further design decisions, our experiments focus on two egocentric camera viewpoints tested on synthetic data with the help of a 3D-scanned hand model, with and without an exoskeleton attached to it. We observe that, in the context of our simulation, hand pose estimation with a wrist-mounted camera performs more accurately than with a head-mounted camera. Further, a grasp assistance system attached to the hand alters its visual appearance and can improve hand pose estimation. Our experiment provides useful insights for the integration of sensors into a context-sensitive analysis framework for intelligent assistance.
Big Data and cloud systems are increasingly used by mobile, user-centered, and agilely changeable information systems in the context of digital social networks. Metaphors from biology for living and self-healing systems and environments provide the basis for intelligent adaptive information systems and for the associated service-oriented digital enterprise architectures. We report on our research into structures and mechanisms of adaptive digital enterprise architectures for the development and evolution of service-oriented ecosystems and their technologies, such as Big Data, services and cloud computing, web services, and semantic support. For our current research we use practice-relevant SmartLife scenarios for the development, maintenance, and evolution of future-oriented service-oriented information systems. These systems use a rapidly growing number of external and internal services and focus on the specifics of evolving information systems for integrated Big Data and cloud contexts. Our research approach addresses the systematic and holistic modeling of adaptive digital enterprise architectures, following standardized reference models and standards-based reference architectures that can be adapted more easily to particular deployment scenarios, smaller application contexts, or new contexts. To enable semantics-supported analyses for the decision support of system and enterprise architects, we extend our existing reference model for IT enterprise architectures, ESARC -- Enterprise Services Architecture Reference Cube -- with agile mechanisms for adaptation and consistency handling, as well as the associated metamodels and ontologies for digital enterprise architectures, to cover new aspects such as Big Data and cloud contexts.
The Internet of Things, enterprise social networks, adaptive case management, mobility systems, analytics for big data, and cloud services environments are emerging to support smart connected products and services and the digital transformation. Biological metaphors of living, adaptable ecosystems provide the logical foundation for self-optimizing and resilient run-time environments for intelligent business services and related distributed information systems with service-oriented enterprise architectures. We are investigating mechanisms of flexible adaptation and evolution for the next generation of digital enterprise architecture systems in the context of the digital transformation. Our aim is to support flexibility and agile transformation for both business and related enterprise systems through adaptation and dynamic evolution of digital enterprise architectures. The present research paper investigates digital transformations of business and IT and integrates fundamental mappings between adaptable digital enterprise architectures and service-oriented information systems. We put a spotlight on the example domain of the Internet of Things.