Real estate markets are known to fluctuate. The real estate market in Stuttgart, Germany, has been booming for more than a decade: square-meter prices have hit record levels and real estate agents claim that market prices will continue to increase. In this paper, we test this market understanding by developing and analyzing a system dynamics model that depicts the Stuttgart real estate market. Simulating the model explains oscillating behavior arising from significant time delays and endogenous feedback structures – and not necessarily from oscillating interest rates, as market experts assume. Scenarios provide insights into the system's behavior in reaction to changes exogenous to the model. The first scenario tests the market development under increasing interest rates. The other scenario deals with possible effects on the real estate market if the regional automotive economy suffers from intense competition with new market players entering with alternative fuel vehicles and new technologies. With a policy run we test market structure changes to eliminate cyclical effects. The paper confirms that the business cycle in the Stuttgart real estate market arises from within the system's underlying structure, thus emphasizing the importance of understanding feedback structures.
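The oscillation-from-delay argument can be sketched in a few lines of Python. This is a deliberately minimal stock-and-flow toy, not the paper's Stuttgart model — the two-stock structure and all parameters are hypothetical: construction starts react to a housing shortage, but units arrive only through a multi-year completion delay, and that delay alone produces damped cycles.

```python
# Minimal stock-and-flow sketch (hypothetical parameters, not the paper's model):
# construction reacts to a shortage, but completions lag behind starts,
# which is enough to generate endogenous oscillation.

def simulate(years=60, dt=0.25, delay=3.0, demand=1000.0):
    stock = 900.0      # housing units in use
    pipeline = 0.0     # units under construction (the delay stock)
    history = []
    for _ in range(int(years / dt)):
        shortage = demand - stock
        starts = max(0.0, 0.5 * shortage)        # developers react to shortage
        completions = pipeline / delay           # first-order completion delay
        pipeline += (starts - completions) * dt
        stock += (completions - 0.02 * stock) * dt   # 2%/yr demolition outflow
        history.append(stock)
    return history

h = simulate()
```

With these numbers the stock starts below its equilibrium of roughly 962 units, overshoots it, and cycles back — no exogenous interest-rate oscillation is needed.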
Modern power transistors are able to switch at very high transition speed, which can cause EMC violations and overshoot. This is addressed by a gate driver with variable gate current, which is able to control the transition speed. The key idea is that the gate driver can influence the di/dt and dv/dt transitions separately and optimize whichever transition promises the highest improvement while keeping switching losses low. To account for changes in the load current, supply voltage, etc., a control loop is required in the driver to ensure optimized switching. In this paper, an efficient control scheme for an automotive gate driver with variable output current capability is presented. The effectiveness of the control loop is demonstrated for a MOSFET bridge consisting of OptiMOS-T2™ devices with a total gate charge of 39 nC. This bridge setup shows dv/dt transitions between 50 and 1000 ns, depending on driving current. The driver is able to switch between gate current levels of 1 and 500 mA within 10/15 ns (rising/falling transition). With the implemented control loop the driver is measured to significantly reduce the ringing and thereby reduce device stress and electromagnetic emissions while keeping switching losses 52% lower than with a constant current driver.
Software and system development is complex and diverse, and a multitude of development approaches is used and combined with each other to address the manifold challenges companies face today. To study the current state of the practice and to build a sound understanding about the utility of different development approaches and their application to modern software system development, in 2016, we launched the HELENA initiative. This paper introduces the 2nd HELENA workshop and provides an overview of the current project state. In the workshop, six teams present initial findings from their regions, impulse talks are given, and further steps of the HELENA roadmap are discussed.
More and more power electronics applications utilize GaN transistors as they enable higher switching frequencies in comparison to conventional Si devices. Faster switching shrinks the size of passive components and enables compact solutions in applications like renewable energy, electric cars and home appliances. GaN transistors benefit from ~10× smaller gate charge QG and gate drive voltages of typically 5V vs. ~15V for Si.
The presented wide-Vin step-down converter introduces a parallel-resonant converter (PRC), comprising an integrated 5-bit capacitor array and a 300 nH resonant coil, placed in parallel to a conventional buck converter. Unlike conventional resonant concepts, the implemented soft-switching control eliminates input voltage dependent losses over a wide operating range. This ensures high efficiency across a wide range of Vin = 12-48 V, 100-500 mA load and 5 V output at up to 15 MHz switching frequency. The peak efficiency of the converter is 76.3%. Thanks to the low output current ripple, the output capacitor can be as small as 50 nF, while the inductor tolerates a larger ESR, resulting in small component size. The proposed PRC architecture is also suitable for future power electronics applications using fast-switching GaN devices.
A 3D face modelling approach for pose-invariant face recognition in a human-robot environment
(2017)
Face analysis techniques have become a crucial component of human-machine interaction in the fields of assistive and humanoid robotics. However, the variations in head-pose that arise naturally in these environments are still a great challenge. In this paper, we present a real-time capable 3D face modelling framework for 2D in-the-wild images that is applicable for robotics. The fitting of the 3D Morphable Model is based exclusively on automatically detected landmarks. After fitting, the face can be corrected in pose and transformed back to a frontal 2D representation that is more suitable for face recognition. We conduct face recognition experiments with non-frontal images from the MUCT database and uncontrolled, in-the-wild images from the PaSC database, the most challenging face recognition database to date, showing an improved performance. Finally, we present our SCITOS G5 robot system, which incorporates our framework as a means of image pre-processing for face analysis.
How can the skin be protected from sunburn? The sun can damage the skin, for example by causing skin cancer, but sunlight also has positive effects on humans. The time spent in the sun and the sun's intensity are the key values that separate an enjoyable sunbath from a negative effect on the skin. A smart device like a UV flower can help you enjoy sunbathing: it measures the UV index around you and passes this information to a smartphone app. The development steps of such a device are described in this paper. The UV flower is made of textile fabrics.
The best fully automated analysis process achieves even better classification results than the established manual process. The best algorithms for the three analysis steps are (i) SGLTR (Savitzky-Golay Laplace operator filter thresholding regions) and LM (Local Maxima) for automated peak identification, (ii) EM clustering (Expectation Maximization) and DBSCAN (Density-Based Spatial Clustering of Applications with Noise) for the clustering step and (iii) RF (Random Forest) for multivariate classification. Thus, automated methods can replace the manual steps in the analysis process to enable an unbiased high throughput use of the technology.
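As a toy illustration of step (i), a Local Maxima (LM) peak picker can be written in pure Python. The signal and threshold below are made up; the paper's SGLTR variant additionally applies Savitzky-Golay/Laplace filtering and region thresholding before a step like this.

```python
# Pure-Python sketch of Local Maxima (LM) peak identification:
# a point is a peak if it exceeds a noise threshold and both neighbours.

def find_peaks(signal, threshold):
    """Return indices that are strict local maxima above a noise threshold."""
    peaks = []
    for i in range(1, len(signal) - 1):
        if signal[i] > threshold and signal[i - 1] < signal[i] > signal[i + 1]:
            peaks.append(i)
    return peaks

spectrum = [0.1, 0.2, 1.5, 0.3, 0.2, 0.9, 2.4, 0.8, 0.1]
print(find_peaks(spectrum, threshold=0.5))  # → [2, 6]
```

The clustering (EM, DBSCAN) and Random Forest steps would then operate on the features extracted around such peaks.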
A gate driver approach is presented for the reduction of turn-on losses in hard switching applications. A significant turn-on loss reduction of up to 55% has been observed for SiC MOSFETs. The gate driver approach uses a transformer which couples energy from the power path back into the gate path during switching events, providing increased gate driver current and thereby faster switching speed.
The gate driver approach was tested on a boost converter running at a switching frequency up to 300 kHz. With an input voltage of 300V and an output voltage of 600V, it was possible to reduce the converter losses by 8% at full load. Moreover, the output power range could be extended by 23% (from 2.75kW to 3.4 kW) due to the reduction of the turn-on losses.
This work presents a fully integrated GaN gate driver in a 180nm HV BCD technology that utilizes high-voltage energy storing (HVES) in an on-chip resonant LC tank, without the need of any external capacitor. It delivers up to 11nC gate charge at a 5V GaN gate, which exceeds prior art by a factor of 45-83, supporting a broad range of GaN transistor types. The stacked LC tank covers an area of only 1.44mm², which corresponds to a superior value of 7.6nC/mm².
We present a new methodology for automatic selection and sizing of analog circuits demonstrated on the OTA circuit class. The methodology consists of two steps: a generic topology selection method supported by a “part-sizing” process and subsequent final sizing. The circuit topologies provided by a reuse library are classified in a topology tree. The appropriate topology is selected by traversing the topology tree starting at the root node. The decision at each node is gained from the result of the part-sizing, which is in fact a node-specific set of simulations. The final sizing is a simulation-based optimization. We significantly reduce the overall simulation effort compared to a classical simulation-based optimization by combining the topology selection with the part-sizing process in the selection loop. The result is an interactive, user-friendly system, which eases the analog designer’s work significantly when compared to typical industrial practice in analog circuit design. The topology selection method and sizing process are implemented as a tool in a typical analog design environment. The design productivity improvement achievable by our method is shown by a comparison to other design automation approaches.
A new method for the analysis of movement dependent parasitics in full custom designed MEMS sensors
(2017)
Due to the lack of sophisticated microelectromechanical systems (MEMS) component libraries, highly optimized MEMS sensors are currently designed using a polygon driven design flow. The strength of this design flow is the accurate mechanical simulation of the polygons by finite element (FE) modal analysis. The result of the FE-modal analysis is included in the system model together with the data of the static electrostatic analysis. However, the system model lacks the dynamic parasitic electrostatic effects arising from the electric coupling between the wiring and the moving structures. In order to include these effects in the system model, we present a method which enables the quasi-dynamic parasitic extraction with respect to in-plane movements of the sensor structures. The method is embedded in the polygon driven MEMS design flow using standard EDA tools. In order to take the influences of the fabrication process into account, such as etching process variations, the method combines the FE-modal analysis and the fabrication process simulation data. This enables the analysis of dynamically changing electrostatic parasitic effects with respect to movements of the mechanical structures. Additionally, the result can be included in the system model, allowing the simulation of positive feedback of the electrostatic parasitic effects on the mechanical structures.
This paper introduces a novel placement methodology for a common-centroid (CC) pattern generator. It can be applied to various integrated circuit (IC) elements, such as transistors, capacitors, diodes, and resistors. The proposed method consists of a constructive algorithm, which generates an initial, close-to-optimum solution, and an iterative algorithm, which is used subsequently if the output of the constructive algorithm does not satisfy the desired criteria. The outcome of this work is an automatic CC placement algorithm for IC element arrays. Additionally, the paper presents a method for the evaluation of CC arrangements. It allows for evaluating the quality of an array and comparing different placement methods.
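The constructive half of such an algorithm can be hinted at with a toy one-dimensional sketch — hypothetical and far simpler than the paper's method: placing half of each device's unit elements and mirroring them about the array centre forces both devices onto a common centroid.

```python
# Toy 1-D common-centroid construction (illustrative, not the paper's
# algorithm): build half of the row, then mirror it, so devices A and B
# share the same geometric centroid and cancel linear process gradients.

def common_centroid_row(units_a, units_b):
    """Build a symmetric A/B unit row, e.g. 2+2 units -> 'ABBA'."""
    half = ['A'] * (units_a // 2) + ['B'] * (units_b // 2)
    return ''.join(half + half[::-1])   # mirror about the centre

print(common_centroid_row(2, 2))  # → ABBA
print(common_centroid_row(4, 2))  # → AABBAA
```

In 'AABBAA' both devices have their centroid at position 2.5 — the shared-centroid property the constructive step is built around; a real generator must additionally handle 2-D arrays, odd unit counts, dispersion and routability.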
A novel configuration of the dual active bridge (DAB) DC/DC converter is presented, enabling more efficient wide voltage range conversion at light loads. A third phase leg as well as a center tapped transformer are introduced on one side of the converter. This concept provides two different turn ratios, thus extending the zero voltage switching operation and resulting in higher efficiency. A laboratory prototype was built converting an input voltage of 40V to an output voltage in the range of 350V to 650V. Measurements show a significant efficiency increase of up to 20% for light-load operation.
Modern power semiconductor devices have low capacitances and can therefore achieve very fast switching transients under hard-switching conditions. However, these transients are often limited by parasitic elements, especially by the source inductance and the parasitic capacitances of the power semiconductor. These limitations cannot be compensated by conventional gate drivers. To overcome this, a novel gate driver approach for power semiconductors was developed. It uses a transformer which accelerates the switching by transferring energy from the source path to the gate path.
Experimental results of the novel gate driver approach show a turn-on energy reduction of 78% (from 80 μJ down to 17 μJ) with a drain-source voltage of 500V and a drain current of 60 A. Furthermore, the efficiency improvement is demonstrated for a hard-switching boost converter. For a switching frequency of 750 kHz with an input voltage of 230V and an output voltage of 400V, it was possible to extend the output power range by 35% (from 2.3 kW to 3.1 kW), due to the reduction of the turn-on losses, therefore lowering the junction temperature of the GaN-HEMT.
This work presents a spiral antenna array, which can be used in the V- and W-band. An array fed with Dolph-Chebychev coefficients is investigated to address issues related to the low gain and side lobe level of the radiating structure. The challenge is to provide an antenna that is not only well matched but also offers an appreciable effective bandwidth in the frequency bands of interest. Its radiation properties, including the effective bandwidth and the gain, are analyzed for the W-band.
Multilevel-cell (MLC) flash is commonly deployed in today’s high density NAND memories, but low latency and high reliability requirements make it barely used in automotive embedded flash applications. This paper presents a time domain voltage sensing scheme that applies a dynamic voltage ramp at the cells’ control gate (CG) in order to achieve fast and reliable sensing suitable for automotive applications.
Sleep quality and, more generally, behavior in bed can be detected using a sleep state analysis. These results can help a subject to regulate sleep and recognize different sleeping disorders. In this work, a sensor grid for pressure and movement detection supporting sleep phase analysis is proposed. In comparison to the leading standard measuring system, polysomnography (PSG), the system proposed in this project is a non-invasive sleep monitoring device. For continuous analysis or home use, PSG or wearable actigraphy devices tend to be uncomfortable, and they are also very expensive. The system presented in this work classifies respiration and body movement with only one type of sensor, a low-cost pressure sensor suitable for commercial purposes, and does so in a non-invasive way. The system was tested in an experiment that recorded the sleep process of a subject; the recordings showed the potential for classification of breathing rate and body movements. Although previous research shows the use of pressure sensors in recognizing posture and breathing, the sensors have mostly been positioned between the mattress and the bedsheet. This project, however, shows an innovative way to position the sensors under the mattress.
The persistently high levels of debt in some member states of the European Economic and Monetary Union continue to raise fears of sovereign insolvency. To cope with the problems that have arisen, but also to prevent such a situation from occurring in the first place, the author considers a sovereign insolvency regime necessary, with a bail-out by the other member states only in emergencies. He proposes a resolution mechanism for over-indebted euro countries based on a 2016 concept of the German Council of Economic Experts.
Propofol in exhaled breath can be measured and may provide a real-time estimate of plasma concentration. However, propofol is absorbed in plastic tubing, so estimates may fail to reflect lung/blood concentration if expired gas is not extracted directly from the endotracheal tube. We evaluated exhaled propofol in five ventilated ICU patients who were sedated with propofol. Exhaled propofol was measured once per minute using ion mobility spectrometry. Exhaled air was sampled directly from the endotracheal tube and at the ventilator end of the expiratory side of the anesthetic circuit. The circuit was disconnected from the patient and propofol was washed out with a separate clean ventilator. Propofol molecules outgassing from the expiratory portion of the breathing circuit were measured for up to 60 h. We also determined whether propofol passes through the plastic of breathing circuits. A total of 984 data pairs (presented as median values with 95% confidence intervals), consisting of both concentrations, were collected. The concentration of propofol sampled near the patient was always substantially higher, at 10.4 [10.25–10.55] versus 5.73 [5.66–5.88] ppb (p<0.001). The reduction in concentration over the breathing circuit tubing was 4.58 [4.48–4.68] ppb: 3.46 [3.21–3.73] in the first hour, 4.05 [3.77–4.34] in the second hour, and 4.01 [3.36–4.40] in the third hour. Propofol out-gassing from the breathing circuit remained at 2.8 ppb after 60 h of washing out. Diffusion through the plastic was not observed. Volatile propofol binds or adsorbs to the plastic of a breathing circuit with saturation kinetics. The bond is reversible, so propofol can be washed out from the plastic. Our data confirm earlier findings that accurate measurement of volatile propofol requires exhaled air to be sampled as close as possible to the patient.
LDMOS transistors in integrated power technologies are often subject to thermo-mechanical stress, which degrades the on-chip metallization and eventually leads to a short. This paper investigates small sense lines embedded in the LDMOS metallization. It will be shown that their resistance depends strongly on the stress cycle number. Thus, they can be used as aging sensors and predict impending failures. Different test structures have been investigated to identify promising layout configurations. Such sensors are key components for resilient systems that adaptively reduce stress to allow aggressive LDMOS scaling without increasing the risk of failure.
Pokémon Go was the first mobile augmented reality (AR) game to reach the top of the download charts of mobile applications. However, little is known about this new generation of mobile online AR games. Existing theories provide limited applicability for user understanding. Against this background, this research provides a comprehensive framework based on uses and gratification theory, technology risk research, and flow theory. The proposed framework aims to explain the drivers of attitudinal and intentional reactions, such as continuance in gaming or willingness to invest money in in-app purchases. A survey among 642 Pokémon Go players provides insights into the psychological drivers of mobile AR games. The results show that hedonic, emotional, and social benefits and social norms drive consumer reactions while physical risks (but not data privacy risks) hinder consumer reactions. However, the importance of these drivers differs depending on the form of user behavior.
Gallium nitride high electron mobility transistors (GaN-HEMTs) have low capacitances and can achieve low switching losses in applications where hard turn-on is required. Low switching losses imply a fast switching; consequently, fast voltage and current transients occur. However, these transients can be limited by package and layout parasitics even for highly optimized systems. Furthermore, a fast switching requires a fast charging of the input capacitance, hence a high gate current.
In this paper, the switching speed limitations of GaN-HEMTs due to the common source inductance and the gate driver supply voltage are discussed. The turn-on behavior of a GaN-HEMT is simulated and the impact of the parasitics and the gate driver supply voltage on the switching losses is described in detail. Furthermore, measurements are performed with an optimized layout for a drain-source voltage of 500 V and a drain-source current up to 60 A.
Decreasing batch sizes in production in line with Industrie 4.0 will lead to tremendous changes in the control of logistic processes in future production systems. Intelligent bins are crucial enablers to establish decentrally controlled material flow systems in value chain networks as well as at the intralogistics level. These intelligent bins have to be integrated into an overall decentralized monitoring and control approach and have to interact with humans and other entities just like other cyber-physical systems (CPS) within the cyber-physical production system (CPPS). To realize a decentralized material supply following the overall aim of a decentralized control of all production and logistics processes, an intelligent bin system is currently developed at the ESB Logistics Learning Factory. This intelligent bin system will be integrated into the self-developed, cloud-based and event-oriented SES (“Self Execution System”), which goes beyond the common functionalities and capabilities of traditional manufacturing execution systems (MES).
To ensure a holistic integration of the intelligent bin for different material types into the SES framework, the required hard- and software components for the decentrally controlled bin system will be split into a common and an adaptable component. The common component represents the localization and network layer which is common for every bin, whereas the flexible component will be customizable to different requirements, like to the specific characteristics of the parts.
This work presents the possibilities of 3D controllers for use in interventional radiology, and in particular for controlling real-time magnetic resonance imaging (MRI). This is interesting with regard to controlled navigation towards a target tissue: real-time imaging lets the interventionalist follow the course of the procedure, but so far they cannot control the MRI scanner themselves during the intervention, as this is done by an assistant in the adjoining room — and communication is very difficult at the high noise level involved. This work addresses that gap and analyzes the suitability of 3D controllers for real-time control of an MRI scanner. Both tracking-based and trackerless devices were considered. The result was that tracking-based methods are less suitable, because their inputs are not interpreted reliably enough, whereas trackerless devices are suitable thanks to the correct interpretation of all inputs and their intuitive operation.
In medicine, various maturity models exist that can support the digitalization of hospitals. The requirements for a maturity model for this purpose cover aspects from both general and specific areas of the hospital. An analysis of the maturity models HIN, CCMM, EMRAM and O-EMRAM reveals large gaps in the operating room domain as well as missing aspects in the emergency department. No comprehensive maturity model was found. A combination of HIN and CCMM could cover almost all areas sufficiently. Additional supplements from specialized maturity models, or even the development of a comprehensive maturity model, would be worthwhile.
Purpose: Medical processes can be modeled using different methods and notations. Currently used modeling systems like Business Process Model and Notation (BPMN) are not capable of describing the highly flexible and variable medical processes in sufficient detail.
Methods: We combined two modeling systems, Business Process Management (BPM) and Adaptive Case Management (ACM), to be able to model non-deterministic medical processes. We used the new standards Case Management Model and Notation (CMMN) and Decision Model and Notation (DMN).
Results: First, we explain how CMMN, DMN and BPMN could be used to model non-deterministic medical processes. We applied this methodology to model 79 cataract operations provided by University Hospital Leipzig, Germany, and four cataract operations provided by University Eye Hospital Tuebingen, Germany. Our model consists of 85 tasks and about 20 decisions in BPMN. We were able to expand the system with more complex situations that might appear during an intervention.
Conclusion: An effective modeling of the cataract intervention is possible using the combination of BPM and ACM. The combination makes it possible to depict complex processes with complex decisions, which is a significant advantage for modeling perioperative processes.
In this paper, a method for the generation of generic surgical process models (gSPMs) with ontology-based generalization was presented. The resulting gSPM was modeled efficiently with BPMN/BPMNsix and can be executed with BPMN workflow engines. In a next step, resource concepts, anatomical structures, and transition probabilities for workflow execution will be implemented.
This paper reports an analysis of the application and impact of FMEA on the susceptibility of generic IT networks. It is well known that frequency and data transmission rate play an important role in communication systems. The rapid miniaturization of electronic devices makes them very sensitive to electromagnetic interference. Since IT networks, with their high data transfer rates, contribute substantially to this development, it is important to monitor their functionality. Therefore, tests are performed to observe and ensure the data transfer rate of IT networks under intentional electromagnetic interference (IEMI). A fault tree model is presented, and the effects observed while irradiating a complex system with HPEM interference sources are described using a continuous and consistent model from the physical layer to the application layer.
Future assembly workplaces must meet changing challenges, such as the increasing number of human-robot collaborations. Virtual reality (VR) technology offers new possibilities for meeting these changed planning challenges in workplace design. This paper presents a method for assessing whether the use of VR technology makes sense for a specific workplace, and shows how VR technology can be integrated into the workplace design process.
Painting galleries typically provide a wealth of data composed of several data types. Such multivariate data are too complex for laypeople such as museum visitors, who first want an overview of all paintings and then want to look for specific categories; the final goal is to guide the visitor to a specific painting that they wish to examine more closely. In this paper we describe an interactive visualization tool that provides such an overview and lets people experiment with the more than 41,000 paintings collected in the Web Gallery of Art. To build such an interactive tool, our technique combines several steps: data handling, algorithmic transformations, visualizations, interactions, and the human user working with the tool to detect insights in the provided data. We illustrate the usefulness of the visualization tool by applying it to these data and show how one can get from an overview of all paintings to specific paintings.
Electric freight vehicles have the potential to mitigate local urban road freight transport emissions, but their numbers are still insignificant. Logistics companies often consider electric vehicles as too costly compared to vehicles powered by combustion engines. Research within the body of the current literature suggests that increasing the driven mileage can enhance the competitiveness of electric freight vehicles. In this paper we develop a numeric simulation approach to analyze the cost-optimal balance between a high utilization of medium-duty electric vehicles – which often have low operational costs – and the common requirement that their batteries will need expensive replacements. Our work relies on empirical findings of the real-world energy consumption from a large German field test with medium-duty electric vehicles. Our results suggest that increasing the range to the technical maximum by intermediate (quick) charging and multi-shift usage is not the most cost-efficient strategy in every case. A low daily mileage is more cost-efficient at high energy prices or consumptions, relative to diesel prices or consumptions, or if the battery is not safeguarded by a long warranty. In practical applications our model may help companies to choose the most suitable electric vehicle for the application purpose or the optimal trip length from a given set of options. For policymakers, our analysis provides insights on the relevant parameters that may either reduce the cost gap at lower daily mileages, or increase the utilization of medium-duty electric vehicles, in order to abate the negative impact of urban road freight transport on the environment.
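A stylized version of this trade-off can be written down in a few lines. All prices, lifetimes and consumption figures below are invented for illustration and are not the paper's empirical data: higher daily mileage spreads the vehicle's fixed cost over more kilometres, but eventually forces expensive battery replacements.

```python
# Toy cost-per-km model for a medium-duty electric vehicle (hypothetical
# numbers): fixed vehicle cost is diluted by mileage, while cumulative
# mileage beyond the battery's life triggers replacements.

def cost_per_km(daily_km, years=8, days_per_year=250,
                vehicle_price=60_000.0, battery_price=20_000.0,
                battery_life_km=120_000, energy_cost_per_km=0.15):
    total_km = daily_km * days_per_year * years
    # number of batteries needed (ceiling division), minus the first one
    replacements = max(0, -(-total_km // battery_life_km) - 1)
    fixed = vehicle_price + replacements * battery_price
    return (fixed + energy_cost_per_km * total_km) / total_km

low = cost_per_km(60)     # modest utilization, battery lasts the full period
high = cost_per_km(200)   # high utilization, three battery replacements
```

In this toy parameterization high utilization still wins (0.45 vs 0.65 per km); the paper's point is that with higher relative energy prices or a short battery warranty the balance can tip towards lower daily mileage.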
In a time of digital transformation, the ability to quickly and efficiently adapt software systems to changed business requirements becomes more important than ever. Measuring the maintainability of software is therefore crucial for the long-term management of such products. With service-based systems (SBSs) being a very important form of enterprise software, we present a holistic overview of maintainability metrics specifically designed for this type of system, since traditional metrics – e.g. object-oriented ones – are not fully applicable in this case. The metric candidates selected in the literature review were mapped to four dominant design properties: size, complexity, coupling, and cohesion. Microservice-based systems (μSBSs) emerge as an agile and fine-grained variant of SBSs. While the majority of identified metrics are also applicable to this specialization (with some limitations), the large number of services in combination with technological heterogeneity and decentralization of control significantly impacts automatic metric collection in such a system. Our research therefore suggests that specialized tool support is required to guarantee the practical applicability of the presented metrics to μSBSs.
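Two of the simplest coupling measures for such systems — each service's fan-in (how many services depend on it) and fan-out (how many it depends on) — can be computed directly from a dependency graph. The service names and call graph below are invented for illustration:

```python
# Sketch of simple coupling metrics on a hypothetical microservice
# call graph: fan-out = outgoing dependencies, fan-in = incoming ones.

deps = {                        # service -> services it calls
    "frontend":  ["orders", "payment"],
    "orders":    ["payment", "inventory"],
    "payment":   ["inventory"],
    "inventory": [],
}

fan_out = {s: len(callees) for s, callees in deps.items()}

fan_in = {s: 0 for s in deps}
for callees in deps.values():
    for callee in callees:
        fan_in[callee] += 1
```

Collecting even these trivial numbers automatically across dozens of heterogeneous services is where the tooling problem the abstract mentions begins.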
Although the advantages of value-based pricing have been known for years, it is gaining ground only slowly. The first study of pricing behavior by business type shows that 35 years of pricing research and consulting have, for the first time, weakened the dominance of cost-based pricing. The author ventures an explanation and encourages greater market orientation.
It is assumed that more education leads to better understanding of complex systems. Some researchers, however, find indications that simple mechanisms like stocks and flows are not well understood even by people who have passed higher education. In this paper, we test people’s understanding of complex systems with the widely studied stock-and-flow (SF) tasks. SF tasks assess people’s understanding of the interplay between stocks and flows. We investigate SF failure of domain experts and novices in different knowledge domains. In particular, we compare performance on the original study’s bathtub task with the square wave pattern against two alternative cover stories from the engineering and business domains, across different groups of business and engineering students from different semesters. Further, we show that, while engineering students perform better than business students, students may, as they progress through higher education, lose the capability of dealing with simple SF tasks. We thus find hints of déformation professionnelle in higher education.
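The bathtub task itself is trivial to state in code, which is what makes the empirical failure rates striking. A minimal sketch with made-up numbers and a square-wave inflow, in the spirit of the original task:

```python
# Bathtub stock-flow task: a stock accumulates inflow minus outflow.
# With a square-wave inflow and constant outflow, the stock traces a
# triangle-like pattern, peaking when inflow drops below outflow.

def bathtub(steps=16, outflow=50):
    stock, history = 100, []
    for t in range(steps):
        inflow = 75 if (t // 4) % 2 == 0 else 25   # square wave: 4 high, 4 low
        stock += inflow - outflow                   # net flow per period
        history.append(stock)
    return history

h = bathtub()
```

The stock peaks exactly when inflow crosses below outflow (here at 200 units), not when inflow peaks — the relationship participants most often get wrong.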
A realistic risk assessment is the basis of responsible corporate decisions. But how can risks be assessed correctly? Various risk management instruments make it possible to systematically identify, quantify, evaluate, and document risks.
Best-practice models and change-management wisdom enjoy great popularity, which can probably be explained by the fact that they reduce complexity and uncertainty for those in charge. However, organizations are full of contradictions, often react irrationally, and do not necessarily follow the carefully planned blueprints of change management. Some questions in organizations are unsolvable, and in their search for solutions organizations oscillate between opposing poles. As idealized models, best practices often cannot meet the expectations placed in them. Where they reach their limits, it seems advisable to engage with the fundamental forces of change such as paradox, ambiguity, complexity, and non-controllability.
The main challenge when driving heat pumps with PV electricity is balancing differing electrical and thermal demands. In this article, a heuristic method for the optimal operation of a heat pump driven by a maximum share of PV electricity is presented. For this purpose, the thermal storages for space heating and domestic hot water (DHW) are activated in order to shift the operation of the heat pump to times of PV generation. The system under consideration covers the thermal and electrical demands of a single-family house. It consists of a heat pump, a thermal energy storage (TES) for DHW, and a grid-connected PV system. For space heating and the generation of domestic hot water, the heat pump runs with two different supply temperatures, thereby achieving a maximum overall COP. Within the optimization algorithm, a set of heuristic rules is developed such that the operational characteristics of the heat pump in terms of minimum running and stopping times are met, as well as the limiting constraints of upper and lower bounds on room temperature and storage energy content. Based on the electricity generated, a varying number of heat pump schedules fulfilling the boundary conditions are created. Finally, the schedule offering the maximum on-site utilization of PV electricity with a minimum number of heat pump starts, which serves as a secondary condition, is selected. Yearly simulations of this combination have been carried out. Initial results of this method indicate a significant rise in on-site consumption of the PV electricity and in heating demand fulfilment by renewable electricity, with no need for a massive TES for the heating system in the form of a big water tank.
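The final selection step can be sketched in a few lines (a hypothetical simplification; the paper's full rule set, which also enforces running times and storage constraints, is richer): among candidate schedules that already satisfy the boundary conditions, pick the one with the maximum on-site PV share, with the fewest compressor starts as tie-breaker.

```python
# Hypothetical sketch of the schedule-selection rule. Schedules are on/off
# sequences per time step; pv_share maps a schedule to its PV-covered
# fraction of consumption (both inputs are illustrative).

def count_starts(schedule):
    """A start is any off -> on (0 -> 1) transition in the schedule."""
    prev, starts = 0, 0
    for state in schedule:
        if prev == 0 and state == 1:
            starts += 1
        prev = state
    return starts

def select_schedule(candidates, pv_share):
    """Maximize the PV share; among equals, minimize the number of starts."""
    return max(candidates, key=lambda s: (pv_share(s), -count_starts(s)))

# Toy example: two schedules with equal PV share -> fewer starts wins.
shares = {(0, 1, 1, 0): 0.6, (1, 0, 1, 0): 0.6, (1, 1, 0, 0): 0.4}
best = select_schedule(list(shares), lambda s: shares[s])
```

The lexicographic key mirrors the abstract's primary objective (PV utilization) and secondary condition (number of starts).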
This paper investigates the impact of dynamic capabilities (DC) on brand love. From a resource-based view, there is little clarity vis-à-vis the specific capabilities that drive the ability to create brand love. This paper focuses on three research questions: Firstly, which dynamic capabilities are relevant for brand love? Secondly, how strong is the impact of certain dynamic capabilities on brand love? Thirdly, which conditions mediate and moderate the impact of specific dynamic capabilities on brand love? Data from a multi-method research approach have been used to identify the specific capabilities that corporations need to enhance brand love. Furthermore, a standardized online survey was conducted among marketing executives and evaluated by structural equation modeling. The results indicate that customer expertise plays a major role in the relationship between dynamic capabilities and brand love. Furthermore, this relationship is more important in markets that have a low competitive differentiation in products and services.
Propofol is a commonly used intravenous general anesthetic. Multi-capillary column (MCC) coupled ion-mobility spectrometry (IMS) can be used to quantify exhaled propofol, and thus estimate plasma drug concentration. Here, we present results of the calibration and analytical validation of a MCC/IMS pre-market prototype for propofol quantification in exhaled air.
Close and safe interaction of humans and robots in joint production environments is technically feasible; however, it should not be implemented as an end in itself but to deliver improvement in any of a production system’s target dimensions. Firstly, this paper shows that an essential challenge for system integrators during the design of HRC applications is to identify a suitable distribution of available tasks between a robotic and a human resource. Secondly, it proposes an approach to determine task allocation by considering the actual capabilities of both human and robot in order to improve work quality. It matches those capabilities with the given requirements of a certain task in order to identify the maximum congruence as the basis for the allocation decision. The approach is based on a study and subsequent generic description of human and robotic capabilities as well as a heuristic procedure that facilitates the decision-making process.
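The capability-matching idea can be illustrated with a small score function (an assumed formalization for illustration only; the paper's generic capability descriptions and heuristic procedure are more elaborate): requirements and capabilities are rated on a common scale, and the resource whose profile best covers the task's requirements receives it.

```python
# Illustrative congruence score between a task's capability requirements and
# a resource's capability profile. Dimension names and levels are made up.

def congruence(requirements, capabilities):
    """Average fulfillment degree across all required capability dimensions."""
    scores = [min(capabilities.get(dim, 0) / level, 1.0)
              for dim, level in requirements.items()]
    return sum(scores) / len(scores)

task  = {"payload": 3, "dexterity": 4, "repeatability": 5}
human = {"payload": 2, "dexterity": 5, "repeatability": 2}
robot = {"payload": 5, "dexterity": 2, "repeatability": 5}

profiles = {"human": human, "robot": robot}
assignee = max(profiles, key=lambda r: congruence(task, profiles[r]))
```

For this toy task the robot's strength in payload and repeatability outweighs the human's dexterity advantage, so the task is allocated to the robot.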
The purpose of this paper is to investigate the use of the sustainable closed-loop supply chain of the fashion brand Filippa K. Information on green fashion has been gathered and a case study approach on the fashion retailer Filippa K was conducted. Results show a switch in knowledge content between a fast fashion supply chain and a sustainable supply chain. There is also an evolution in sustainability, as companies, retailers, and manufacturers come under pressure from customers, governments, and the media. Sustainable fashion brands like Filippa K are interested in sharing precise knowledge on a variety of aspects linked to the sustainable closed-loop supply chain. This research has been limited by the scarcity of information and the number of unexplored topics in the field of green fashion, which led to a critical engagement with the brand Filippa K.
Since there is no denying that transparency is increasingly central to corporate sustainability, the purpose of this paper is a case study on a company’s attempt to be fully transparent, thereby picking up the existing scholarly conversation about uncompromising supply chain transparency. The literature so far was found to be fairly limited but, following a trend, has been growing in recent years. To address these shortcomings, an in-depth literature review on the multiple dimensions of supply chain transparency has been performed and the links within supply networks stressed. On this basis, a case study illustrating the fashion label Honest by has been drafted and its effort to become the world’s first 100 % transparent company further examined. The findings are discussed with respect to whether more supply chain transparency is desirable in every case, obstacles are listed, and an outlook for this kind of business model is drawn. The research is clearly limited by the amount of scholarly literature concerning Honest by in particular. For this reason, magazine and journal entries are used as references as well. Only with the extension of the topic to supply chain transparency in general and the preceding literature review did the paper gain the necessary academic standard. Concerning implications, it needs to be mentioned that even though Honest by claims to be fully transparent, it was not possible to find any public information about the degree of its supplier relationships, in particular concerning the control mechanisms applied to exert influence and to balance out the power gradient between the company and its suppliers.
Characterisation of porous knitted titanium for replacement of intervertebral disc nucleus pulposus
(2017)
Effective restoration of human intervertebral disc degeneration is challenged by numerous limitations of the currently available spinal fusion and arthroplasty treatment strategies. Consequently, the use of artificial biomaterial implants is gaining attention as a potential therapeutic strategy. Our study is aimed at investigating and characterizing a novel knitted titanium (Ti6Al4V) implant for the replacement of the nucleus pulposus to treat early stages of chronic intervertebral disc degeneration. A specific knitted geometry of the scaffold with a porosity of 67.67 ± 0.824% was used to overcome tissue integration failures. Furthermore, to improve the wear resistance without impairing the original mechanical strength, an electro-polishing step was employed. The electro-polishing treatment changed the surface roughness from 15.22 ± 3.28 to 4.35 ± 0.87 μm without affecting the wettability, which remained at 81.03 ± 8.5°. Subsequently, cellular responses of human mesenchymal stem cells (SCP1 cell line) and human primary chondrocytes were investigated, which showed positive responses in terms of adherence and viability. The surface wettability was further enhanced to a superhydrophilic state by oxygen plasma treatment, which eventually caused a substantial increase in the proliferation of SCP1 cells and primary chondrocytes. Our study implies that, owing to the scaffold’s physicochemical and biocompatible properties, it could improve the clinical performance of nucleus pulposus replacement.
To analyze human sleep it is necessary to identify the sleep stages occurring during sleep, their durations, and the sleep cycles. The gold standard procedure for this is polysomnography (PSG), which classifies the sleep stages based on the Rechtschaffen and Kales (R-K) method. Besides advantages such as high accuracy, this method has some disadvantages; among others, the procedure is time-consuming and uncomfortable for the patient. Therefore, the development of further methods for sleep classification in addition to PSG is a promising topic for investigation, and this work aims to present possible ways and goals for this development.
In vitro cultured cells produce a complex extracellular matrix (ECM) that remains intact after decellularization. The biological complexity derived from the variety of distinct ECM molecules makes these matrices ideal candidates for biomaterials. Biomaterials with the ability to guide cell function are a topic of high interest in biomaterial development. However, these matrices lack specific addressable functional groups, which are often required for their use as a biomaterial. Due to the biological complexity of the cell-derived ECM, it is a challenge to incorporate such functional groups without affecting the integrity of the biomolecules within the ECM. The azide-alkyne cycloaddition (click reaction, Huisgen-reaction) is an efficient and specific ligation reaction that is known to be biocompatible when strained alkynes are used to avoid the use of copper (I) as a catalyst. In our work, the ubiquitous modification of a fibroblast cell-derived ECM with azides was achieved through metabolic oligosaccharide engineering by adding the azide-modified monosaccharide Ac4GalNAz (1,3,4,6 tetra-O-acetyl-N-azidoacetylgalactosamine) to the cell culture medium. The resulting azide-modified network remained intact after removing the cells by lysis and the molecular structure of the ECM proteins was unimpaired after a gentle homogenization process. The biological composition was characterized in order to show that the functionalization does not impair the complexity and integrity of the ECM. The azides within this “clickECM” could be accessed by small molecules (such as an alkyne-modified fluorophore) or by surface-bound cyclooctynes to achieve a covalent coating with clickECM.
The purpose of this paper is to investigate how the practice of closed-loop production systems (CLPS) is implemented in the fashion industry. This paper offers a critical literature review to present a thorough understanding of the actual status of the literature. Subsequently, the paper reveals that CLPS are of great importance. Generally, such systems include different activities that have to be integrated. Critical points are product acquisition, the recovery process itself, and the remarketing to the customer. A lack of reliable data concerning CLPS in the specific case of the fashion industry can be identified. Important research fields could be marketing strategies, controlling the acquisition process, the evolution of return technologies and strategies, the adaptation of recovered products to the mass market, and the development of new technologies for recovery processes.
The purpose of this paper is to compare existing consumption patterns caused by fast fashion with a newly appearing form of consumption, and to assess its potential as an alternative as well as sustainable form of fast fashion consumption. This research is set up on a theoretical background of scientific literature, including governmental as well as press releases, in order to evaluate the status quo of consumption and to answer the research question. A new consumption pattern as well as an emerging sharing economy can be identified, including potential aspects of growing businesses and sustainable alternative forms of fast fashion. The framework of the research is limited to the textile and fashion industry in industrialized countries, focusing on consumption in the twenty-first century.
Comments on “Solubility parameter of chitin and chitosan”, Carbohydrate Polymers 36 (1998) 121–127
(2017)
Results on the solubility parameters of chitin and chitosan presented in the paper DOI: 10.1016/S0144-8617(98)00020-4 were recalculated and data evaluation was redone. A number of misprints, erroneous calculations and data evaluations were found with respect to Hansen as well as total solubility parameters as derived according to group contribution methods by Hoftyzer-Van Krevelen and Hoy’s system. Revised numerical data are presented.
The influence of turbidity on the Raman signal strengths of condensed matter is theoretically analyzed and measured with laboratory-scale equipment for remote sensing. The results show the quantitative dependence of back- and forward-scattered signals on the thickness and elastic-scattering properties of matter. In the extreme situation of thin, highly turbid layers, the measured Raman signal strengths exceed their transparent analogs by more than a factor of ten. The opposite behavior is found for thick layers of low turbidity, where the presence of a small amount of scatterers leads to a decrease of the measured signal. The wide range of turbidities appearing in nature is experimentally realized with stacked polymer layers and solid/liquid dispersions, and theoretically modeled by the equation of radiative transfer using the analytical diffusion approximation or random walk simulations.
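The random-walk modelling approach can be illustrated with a deliberately simplified one-dimensional photon walk (a toy model for intuition, not the paper's simulation): photons enter a slab, take exponentially distributed steps in units of the scattering mean free path, and each scattering event randomizes the direction.

```python
import random

# Toy 1-D random-walk photon model. We count the fractions of photons leaving
# through the front face (back-scattered) and the rear face (forward-scattered)
# of a slab of optical thickness `tau`.

def walk_photons(tau, n_photons=20000, seed=1):
    rng = random.Random(seed)
    back = forward = 0
    for _ in range(n_photons):
        depth, direction = 0.0, 1.0   # enter at the front face, moving inward
        while True:
            depth += direction * rng.expovariate(1.0)
            if depth <= 0.0:          # escaped through the front face
                back += 1
                break
            if depth >= tau:          # escaped through the rear face
                forward += 1
                break
            direction = rng.choice((-1.0, 1.0))  # isotropic rescattering

    return back / n_photons, forward / n_photons

# A thin slab transmits most photons; a thick, turbid one returns more of them.
b_thin, f_thin = walk_photons(tau=0.5)
b_thick, f_thick = walk_photons(tau=10.0)
```

Even this crude model reproduces the qualitative dependence described above: increasing optical thickness shifts the detected signal from the forward to the backward direction.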
With the Internet of Things being one of the most discussed trends in the computer world lately, many organizations find themselves struggling with the great paradigm shift and thus the implementation of IoT on a strategic level. The Ignite methodology, as a part of the Enterprise-IoT project, promises to support organizations with these strategic issues as it combines best practices with expert knowledge from diverse industries, helping to create a better understanding of how to transform into an IoT-driven business. A framework that is introduced within the context of IoT business model development is the Bosch IoT Business Model Builder. In this study the provided framework is compared to the Osterwalder Business Model Canvas and the St. Gallen Business Model Navigator, the most commonly used and referenced frameworks according to a quantitative literature analysis.
This paper describes a new method for condition monitoring of a roller chain. In contrast to conventional methods, no additional accelerometers are used to measure and interpret frequency spectra; instead, the chain condition is evaluated using an easy-to-interpret similarity measure based on correlation functions of the driving motor torque. An additional clustering of current data and reference measurements yields an easy-to-understand representation of the chain condition.
The business landscape is changing radically because of software. Companies in all industry sectors are continuously finding new flexibilities in this programmable world. They are able to deliver new functionalities even after the product is already in the customer's hands. But success is far from guaranteed if they cannot validate their assumptions about what their customers actually need. A competitor with better knowledge of customer needs can disrupt the market in an instant.
This book introduces continuous experimentation, an approach to continuously and systematically test assumptions about the company's product or service strategy and verify customers' needs through experiments. By observing how customers actually use the product or early versions of it, companies can make better development decisions and avoid potentially expensive and wasteful activities. The book explains the cycle of continuous experimentation, demonstrates its use through industry cases, provides advice on how to conduct experiments with recipes, tools, and models, and lists some common pitfalls to avoid. Use it to get started with continuous experimentation and make better product and service development decisions that are in-line with your customers' needs.
Condition monitoring for mechanical systems like bearings or transmissions is often done by analysing frequency spectra obtained from accelerometers mounted on the components under observation. Although this approach gives a high amount of information about the system behaviour, the interpretation of the resulting spectra requires expert knowledge, that is, a deep understanding of the effect of condition deterioration on the measured spectra. However, an increasing number of condition monitoring applications demands other representations of the measured signals that can be easily interpreted even by non-experts. Therefore, the objective of this paper is to develop an approach for processing measured process data in order to obtain an easy-to-interpret measure for assessing the component condition. The main idea is to evaluate the deterioration of a component condition by computing the correlation function of current measurements with past measurements, in order to detect condition deterioration from a change in these correlation functions. Besides the simplicity of the obtained measure, this approach opens the opportunity for integrating a model-based approach as well. The developed method is tested in a condition monitoring application on a roller chain.
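The correlation idea can be sketched as follows (an assumed minimal formulation; the paper's measure and clustering step are more elaborate): compare the current measurement window with a reference window via the normalized correlation coefficient, where a value near 1 indicates an unchanged condition and a drop indicates deterioration.

```python
import math

# Minimal sketch of a correlation-based similarity measure between a
# reference (healthy-condition) signal window and a current window.

def similarity(reference, current):
    """Normalized zero-lag correlation of two equally long signals."""
    mr = sum(reference) / len(reference)
    mc = sum(current) / len(current)
    num = sum((r - mr) * (c - mc) for r, c in zip(reference, current))
    den = math.sqrt(sum((r - mr) ** 2 for r in reference) *
                    sum((c - mc) ** 2 for c in current))
    return num / den

reference = [0.0, 1.0, 0.0, -1.0] * 8   # illustrative torque pattern
unchanged = [0.0, 1.0, 0.0, -1.0] * 8   # identical -> similarity 1
shifted   = [1.0, 0.0, -1.0, 0.0] * 8   # phase-shifted -> similarity drops
```

A single scalar in [-1, 1] is exactly the kind of representation a non-expert can monitor, in contrast to a full frequency spectrum.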
The purpose of this research is to explore the current boundaries of the fashion industry’s second hand market and to identify which solutions and approaches can be adopted from the used-car industry. The paper is based on the study of existing literature dealing with sustainability in combination with second hand markets in general and adaptable features of the used-car industry. Adaptable features are identified using the business model canvas. The key finding of this study indicates that the fashion industry faces immense social and environmental challenges which can be partly addressed by the development of the second hand market. The used-car industry can be seen as a role model for fashion retail. In this study only aspects of used-car distribution are highlighted; therefore, characteristics of the recycling of used cars are not examined.
The diversity of energy prosumer types makes it difficult to create appropriate incentive mechanisms that satisfy both prosumers and energy system operators alike. Meanwhile, European energy suppliers buy guarantees of origin (GoO) which allow them to sell green energy at premium prices while in reality delivering grey energy to their customers. Blockchain technology has proven itself to be a robust payment system in which users transact money without the involvement of a third party. Blockchain tokens can be used to represent a unit of energy and, just like GoOs, be submitted to the market. This paper focuses on simulating a marketplace, using the Ethereum blockchain and smart contracts, where prosumers can sell tokenized GoOs to consumers willing to subsidize renewable energy producers. Such markets bypass energy providers by allowing consumers to obtain tokenized GoOs directly from the producers, which in turn benefit directly from the earnings. Two market strategies in which tokens are sold as GoOs have been simulated. In the Fix Price Strategy, prosumers sell their tokens at the average GoO price of 2014. The Variable Price Strategy focuses on selling tokens in a price range defined by the difference between grey and green energy prices. The study finds that the Ethereum blockchain is robust enough to function as a platform for tokenized GoO trading. The simulation results have been compared, and they indicate that prosumers earn significantly more money by following the Variable Price Strategy.
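The revenue difference between the two strategies can be illustrated with a toy calculation (all prices here are made-up placeholders, not the paper's simulation data): under the Fix Price Strategy every token sells at a fixed average GoO price, while under the Variable Price Strategy tokens sell at the grey/green price spread.

```python
# Illustrative comparison of the two market strategies. Each token is assumed
# to represent one unit of certified green energy; all prices are invented.

FIX_GOO_PRICE = 0.5  # assumed average guarantee-of-origin price per unit

def fix_price_revenue(tokens_sold):
    """Fix Price Strategy: every token sells at the average GoO price."""
    return tokens_sold * FIX_GOO_PRICE

def variable_price_revenue(tokens_sold, grey_price, green_price):
    """Variable Price Strategy: tokens sell at the grey/green price spread."""
    return tokens_sold * max(green_price - grey_price, 0.0)

# With an assumed spread of 2.0 per unit, the variable strategy earns more.
rev_fix = fix_price_revenue(100)
rev_var = variable_price_revenue(100, grey_price=30.0, green_price=32.0)
```

Whenever the grey/green spread exceeds the fixed GoO price, the Variable Price Strategy yields the higher prosumer revenue, which matches the qualitative finding above.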
Purpose: The purpose of this paper is to examine the service of the new business model Curated Shopping in the fashion industry and to analyze whether the service provides a higher customer added value in comparison to traditional services in retail stores and e-commerce platforms. It gives implications for curated shop operators on how to optimize the service in each stage of the customer buying process.
Design/methodology/approach: The research methodology applied is an empirical study that uses the principle of mystery shopping in order to investigate the provided services during the selling process.
Findings: The study showed that information about the customer should be collected carefully and as holistically as possible in order to put together a suitable outfit. The consumer is able to benefit from the service by saving time and enjoying a stress-free way of shopping. Nevertheless, the curator’s ability to give individual and inspiring advice is limited by the physical distance to the customer.
Research limitations: The survey was conducted among 10 mystery shoppers and 4 curated shop operators in Germany, limiting the findings to these mystery shoppers and operators.
Practical implications: One implication for shop operators is to collect consumer information carefully and to expand the assortment and brand portfolio in order to provide fashion goods that inspire the consumer. The shop operators are on the right track; still, there is huge potential to provide a more shopper-oriented service.
Electronic word-of-mouth (eWoM) communication has received a lot of attention from the academic community. As multiple research papers focus on specific facets of eWoM, there is a need to integrate current research results systematically. Thus, this paper presents a scientific literature analysis in order to determine the current state-of-the-art in the field of eWoM.
Curriculum design for the German language class in the double-degree programme business engineering
(2017)
This paper aims to give an overview on how German is taught as a foreign language to students enrolled in the Bachelor of Business Engineering, a double-degree programme offered in Universiti Malaysia Pahang. The double degree students have the opportunity to complete their first two years of study in Malaysia and their last two years in Germany. Taking the TestDaF examination is compulsory for double-degree students. Hence, the German Language curriculum has been meticulously planned to ensure the students would be competent in the language. As such, the settings of the language class are discussed thoroughly in this paper. Additionally, it also discusses the challenges faced in teaching German as foreign language. This paper ends with some suggestions for improvement.
The increasing number of connected mobile devices such as fitness trackers and smartphones provides new data for health insurances, enabling them to gain deeper insights into the health of their customers. These additional data sources, plus the trend towards an interconnected health community including doctors, hospitals and insurers, lead to challenges regarding data filtering, organization and dissemination. First, we analyze what kind of information is relevant for a digital health insurance. Second, functional and non-functional requirements for storing and managing health data in an interconnected environment are defined. Third, we propose a data architecture for a digitized health insurance, consisting of a data model and an application architecture.
Data integration of heterogeneous data sources relies either on periodically transferring large amounts of data to a physical Data Warehouse or on retrieving data from the sources on request only. The latter results in the creation of what is referred to as a virtual Data Warehouse, which is preferable when the use of the latest data is paramount. However, the downside is that it adds network traffic and suffers from performance degradation when the amount of data is high. In this paper, we propose the use of a readCheck validator to ensure the timeliness of the queried data and to reduce data traffic. It is further shown that the readCheck allows transactions to update data in the data sources while obeying full Atomicity, Consistency, Isolation, and Durability (ACID) properties.
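A readCheck-style freshness check can be sketched as follows (a hypothetical minimal protocol for illustration; the paper's validator and its ACID handling are more involved): the virtual warehouse caches each source's data together with a version token and, on every query, transfers only the token to verify freshness, re-fetching the full result only when it has changed.

```python
# Hypothetical sketch of version-token-based freshness validation between a
# data source and a virtual warehouse. Class and attribute names are made up.

class Source:
    def __init__(self, data):
        self.data, self.version = data, 0

    def update(self, data):
        self.data, self.version = data, self.version + 1

class VirtualWarehouse:
    def __init__(self, source):
        self.source = source
        self.cache, self.cached_version = None, -1
        self.full_fetches = 0

    def query(self):
        # readCheck: compare version tokens only; cheap compared to data transfer.
        if self.source.version != self.cached_version:
            self.cache = self.source.data            # re-fetch full data
            self.cached_version = self.source.version
            self.full_fetches += 1
        return self.cache

source = Source(["row1", "row2"])
vw = VirtualWarehouse(source)
vw.query(); vw.query()                    # second call answered from the cache
source.update(["row1", "row2", "row3"])
result = vw.query()                       # version changed -> one more full fetch
```

Only stale caches cost a full transfer, which is how such a scheme reduces data traffic while still guaranteeing timely results.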
The Ninth International Conference on Advances in Databases, Knowledge, and Data Applications (DBKDA 2017), held between May 21 - 25, 2017 in Barcelona, Spain, continued a series of international events covering a large spectrum of topics related to advances in database fundamentals, the evolution of the relation between databases and other domains, database technologies and content processing, as well as specifics of databases in application domains.
Advances in different technologies and domains related to databases triggered substantial improvements in content processing, information indexing, and data, process, and knowledge mining. The push came from Web services, artificial intelligence, and agent technologies, as well as from the widespread adoption of XML.
High-speed communications and computations, large storage capacities, and load balancing for distributed database access allow new approaches for content processing with incomplete patterns, advanced ranking algorithms, and advanced indexing methods.
The evolution of e-business, e-health and telemedicine, bioinformatics, finance and marketing, and geographical positioning systems puts pressure on database communities to push the ‘de facto’ methods to support new requirements in terms of scalability, privacy, performance, indexing, and heterogeneity of both content and technology.
Digitization fosters the development of IT environments with many rather small structures, like the Internet of Things (IoT), microservices, or mobility systems. They are needed to support flexible and agile digitized products and services. The goal is to create service-oriented enterprise architectures (EA) that are self-optimizing and resilient. The present research paper investigates methods for decision-making concerning digitization architectures for the Internet of Things and microservices. They are based on evolving enterprise architecture reference models and state-of-the-art elements of architectural engineering for microgranular systems. Decision analytics in this field becomes increasingly complex, and decision support, particularly for the development and evolution of sustainable enterprise architectures, is sorely needed. These challenging decision processes can be supported in a more flexible and intuitive way by an architecture management cockpit.
By now, the use of technical aids for analysis purposes in sports is an integral part of the daily training routine of coaches and athletes. In almost every sport, video recordings are used to document and analyze the execution of movements. However, recordings from a static location are often no longer sufficient. Here, virtual reality (VR) can offer a solution to this problem. With VR, a further layer can be added to the recorded scene, and movement sequences can be evaluated anew and in more detail. To map movements in a virtual environment, they must be recorded by means of motion capturing (MoCap). The aim of this work is to find out whether the MoCap system Perception Neuron is able to capture movements at high speed.
In this paper we describe the design and development process of an electromagnetic picker for rivets. These rivets are used in a production process of leather or textile design objects like riveted waist belts or purses. The picker is designed such that it replaces conventional mechanical pickers thus avoiding mechanical wear problems and increasing the process quality. The paper illustrates the challenges in the design process of this mechatronic system. The design process was based on both simulation and experiments leading to a prototype that satisfies the requirements.
In this study, a novel strategy has been developed for the assembly of polyelectrolyte multilayers (PEM) on CaCO3 templates in acidic pH solutions, where consecutive polyelectrolyte layers (heparin/poly(allylamine hydrochloride) or heparin/chitosan) were deposited on PEM hollow microcapsules established previously on CaCO3 templates. The PEM build-up, hollow capsule characterization, and successful encapsulation of fluorescein 5(6)-isothiocyanate (FITC)-dextran by coprecipitation with CaCO3 are demonstrated. Improved removal of the CaCO3 core was achieved during the depositions. In the release profile, a high retardation of the encapsulated FITC-dextran was observed. The combined-shell capsule system is a significant trait with potential use in tailoring functional layer-by-layer capsules as intelligent drug delivery vehicles, where preliminary in vitro tests showed responsiveness to enzymes.
Clinical reading centers provide expertise for consistent, centralized analysis of medical data gathered in a distributed context. Accordingly, appropriate software solutions are required for the involved communication and data management processes. In this work, an analysis of general requirements and essential architectural and software design considerations for reading center information systems is provided. The identified patterns have been applied to the implementation of the reading center platform which is currently operated at the Center of Ophthalmology of the University Hospital of Tübingen.
The modern industrial corporation encompasses a myriad of different software applications, each of which must work in concert to deliver functionality to end-users. However, the increasingly complex and dynamic nature of competition in today’s product-markets dictates that this software portfolio be continually evolved and adapted, in order to meet new business challenges. This ability – to rapidly update, improve, remove, replace, and reimagine the software applications that underpin a firm’s competitive position – is at the heart of what has been called IT agility. Unfortunately, little work has examined the antecedents of IT agility, with respect to the choices a firm makes when designing its “Software Portfolio Architecture.”
We address this gap in the literature by exploring the relationship between software portfolio architecture and IT agility at the level of the individual applications in the architecture. In particular, we draw from modular systems theory to develop a series of hypotheses about how different types of coupling impact the ability to update, remove or replace the software applications in a firm’s portfolio. We test our hypotheses using longitudinal data from a large financial services firm, comprising over 1,000 applications and over 3,000 dependencies between them. Our methods allow us to disentangle the effects of different types and levels of coupling.
Our analysis reveals that applications with higher levels of coupling cost more to update, are harder to remove, and are harder to replace, than those with lower coupling. The measures of coupling that best explain differences in IT agility include all indirect dependencies between software applications (i.e., they include coupling and dependency relationships that are not easily visible to the system architect). Our results reveal the critical importance of software portfolio design decisions, in developing a portfolio of applications that can evolve and adapt over time.
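The distinction between direct and indirect coupling can be made concrete with a small dependency-graph sketch (an assumed formalization; application names and the graph are illustrative): direct coupling counts an application's immediate dependencies, while indirect coupling counts everything reachable from it through one or more hops.

```python
# Sketch of direct vs. indirect coupling on a directed dependency graph,
# represented as a dict mapping each application to its dependencies.

def direct_coupling(graph, app):
    """Number of immediate dependencies of `app`."""
    return len(graph.get(app, ()))

def indirect_coupling(graph, app):
    """Number of applications reachable from `app` via one or more hops."""
    seen, stack = set(), list(graph.get(app, ()))
    while stack:
        node = stack.pop()
        if node not in seen:
            seen.add(node)
            stack.extend(graph.get(node, ()))
    return len(seen)

# Toy portfolio: A depends on B; B depends on C and D.
portfolio = {"A": ["B"], "B": ["C", "D"]}
```

Application A has only one direct dependency but three reachable ones; it is exactly this gap, invisible to an architect who looks only at direct links, that the indirect measures capture.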
Technologies for mapping the “digital twin” have been under development for approximately 20 years. Nowadays, increasingly intelligent, individualized products encourage companies to respond innovatively to customer requirements and to handle the rising number of product variants quickly.
An integrated engineering network spanning the entire value chain is operated to intelligently connect various company divisions and to generate a business ecosystem for products, services and communities. This establishes the conditions for the digital twin, in which the digital world can be fed into the real world, and the real world back into the digital, so that such intelligent products with their rising number of variants can be handled.
The term digital twin can be described as a digital copy of a real factory, machine, worker etc. that is created and can be independently expanded, automatically updated, and made globally available in real time. Every real product and production site is permanently accompanied by a digital twin. First prototypes of such digital twins already exist in the ESB Logistics Learning Factory, realized as cloud- and app-based software that builds on a dynamic, multidimensional data and information model. A standardized language for the robot control systems, via software agents and positioning systems, has to be integrated. The continuity of the real factory in the digital factory, as an economical means of keeping digital models continuously up to date, is regarded as the basis of changeability.
For indoor localization, sensor combinations should be used that, in addition to the hardware, already contain the software required for sensor data fusion. Processing systems, live scenario simulations and digital shop floor management result in a mandatory procedural combination. Essential to the digital twin is the ability to consistently provide all subsystems with the latest state of all required information, methods and algorithms.
Digitization will require companies to fundamentally reengineer their sales processes. Adapting the concept of value selling to the digital age will enable them to deliver superior value to their customers. Specifically, social selling will provide them with an answer to the ever-increasing complexity of customer journeys. This article, based on a survey among 235 German companies, assesses the status quo and outlines opportunities. Moreover, it introduces a novel approach for developing well-grounded social selling metrics.
Digitization in the energy sector is a necessity to unlock energy savings and energy efficiency potentials. Managing decentralized corporate energy systems is hindered by the absence of suitable management methods. The required integration of energy objectives into business strategy creates difficulties, resulting in inefficient decisions. To improve this, practice-proven methods such as the Balanced Scorecard, Enterprise Architecture Management and the Value Network approach are transferred to the energy domain. The methods are evaluated based on a case study. Managing multi-dimensionality, high complexity and multiple actors are the main drivers for an effective and efficient energy management system. The underlying basis for gaining the positive impacts of these methods on decentralized corporate energy systems is the digitization of energy data and processes.
Recent digital technologies like the Internet of Things and Augmented Reality have brought IT into companies’ core products. What were previously purely physical products are becoming hybrid or digitized. Despite receiving a lot of recent attention, digitized products have only seen a slow uptake in businesses so far. In this paper, we study the challenges that keep companies from realizing the desired impacts of digitized products and the practices they employ to address these challenges. To do so, we looked at companies from a set of industries that are highly affected by digital transformation, but at the same time hesitant to move to a more digitized world: the creative industries. Based on a literature review and twelve interviews in creative industries, we developed a conceptual model that can serve as a basis for formulating testable hypotheses for further research in this area.
Cell–cell and cell–extracellular matrix (ECM) adhesion regulates fundamental cellular functions and is crucial for cell–material contact. Adhesion is influenced by many factors, such as the affinity and specificity of the receptor–ligand interaction or the overall ligand concentration and density. To investigate molecular details of cell–ECM and cadherin (cell–cell) interactions in vascular cells, functional nanostructured surfaces were used: ligand-functionalized gold nanoparticles (AuNPs) with 6–8 nm diameter are precisely immobilized on a surface and separated by non-adhesive regions, so that individual integrins or cadherins can specifically interact with the ligands on the AuNPs. Using 40 nm and 90 nm distances between the AuNPs, functionalized either with peptide motifs of the extracellular matrix (RGD or REDV) or with vascular endothelial cadherin (VEC), the influence of distance and ligand specificity on spreading and adhesion of endothelial cells (ECs) and smooth muscle cells (SMCs) was investigated. We demonstrate that RGD-dependent adhesion of vascular cells is similar to that of other cell types and that the distance dependence of integrin binding to ECM peptides is also valid for the REDV motif. VEC ligands decrease adhesion significantly at the tested ligand distances. These results may be helpful for future improvements in vascular tissue engineering and for the development of implant surfaces.
The purpose of this paper is to study the impact of transparency on the political budget cycle (PBC) over time and across countries. So far, the literature on electoral cycles finds evidence that cycles depend on the stage of an economy. However, the author shows – for the first time – a reliance of the budget cycle on transparency. The author uses a new data set consisting of 99 developing and 34 Organization for Economic Cooperation and Development countries. First, the author develops a model and demonstrates that transparency mitigates the political cycles. Second, the author confirms the proposition through the econometric assessment. The author uses time series data from 1970 to 2014 and discovers smaller cycles in countries with higher transparency, especially G8 countries.
In retail environments, consumers commonly evaluate products while standing on some type of flooring and concurrently being exposed to music; however, no study has examined the interaction of these two atmospheric cues. To bridge this gap, this research examines whether retailers can benefit from creating multisensory congruent rather than incongruent retail environments of flooring and music. The results of an experiment in a real retail store reveal positive effects of multisensory congruent retail environments (e.g., soft music combined with soft flooring) on product evaluations. This study provides a new process explanation, with consumers’ purchase-related self-confidence mediating these effects. Specifically, consumers in congruent rather than incongruent retail environments experience more purchase-related self-confidence, which in turn leads to more favorable product evaluations. Furthermore, this study shows that consumers with a low rather than a high preference for haptic information are influenced more by multisensory atmospheric congruence when evaluating a product haptically.
In 2016, German car manufacturer the Audi Group (AUDI AG) was working on an expanding array of digital innovations. The goals of these innovations varied, and included strengthening customer- and employee-facing processes, digitally enhancing existing products, and developing new, potentially disruptive business models. Audi’s IT unit was critical to each of these efforts. Based on personal interviews with 11 IT- and non-IT executives at Audi, this case examines the different ways in which digitization can help to enhance and transform an organization’s processes, products, and business models. The case also highlights the challenges that arise as large companies “digitize.”
In this article, liposome-based coatings aiming to control drug release from therapeutic soft contact lens (SCL) materials are analyzed. A PHEMA-based hydrogel material loaded with levofloxacin is used as a model system for this research. The coatings are formed by polyelectrolyte layers containing liposomes of 1,2-dimyristoyl-sn-glycero-3-phosphocholine (DMPC) and DMPC/cholesterol (DMPC/CHOL). The effect of friction and temperature on the drug release is investigated. The aim of the friction tests is to simulate the blinking of the eyelid in order to verify whether the SCL materials coated with liposomes are able to keep their properties, in particular the drug release ability. It was observed that, under the study conditions, friction did not significantly affect the drug release from the liposome-coated PHEMA material. In contrast, increasing the release temperature leads to an increase of the drug diffusion rate through the hydrogel. This phenomenon is recorded both in the control and in the coated samples.
A concept for a slope-shaping gate driver IC is proposed, which is used to establish control over the slew rates of current and voltage during the turn-on and turn-off switching transients.
It combines the high speed and linearity of a fully-integrated closed-loop analog gate driver, which is able to perform real-time regulation, with the advantages of digital control, like flexibility and parameter independence, operating in a predictive cycle-by-cycle regulation. In this work, the analog gate drive integrated circuit is partitioned into functional blocks and modeled in the small-signal domain, which also includes the non-linearity of parameters. An analytical stability analysis has been performed in order to ensure full functionality of the system controlling a modern-generation IGBT and a superjunction MOSFET. Major parameters of influence, such as the gate resistor and the summing node capacitance, are investigated to achieve stable control. The large-signal behavior, investigated by simulations of a transistor-level design, verifies the correct operation of the circuit. Hence, the gate driver can be designed for robust operation.
EBIT & Co.
(2017)
A whole range of key figures is used in business administration to determine and manage corporate profit. However, not all of them are suited to the same purpose. Depending on the question at hand, different key figures should be consulted. Not least, their interpretation must also be industry-specific.
In this paper we build on our research in data management on native Flash storage. In particular, we demonstrate the advantages of intelligent data placement strategies. To effectively manage physical Flash space and organize the data on it, we utilize novel storage structures such as regions and groups. These are coupled to common DBMS logical structures and thus require no extra overhead for the DBA. The experimental results indicate an improvement of up to 2x, which doubles the longevity of Flash SSDs. During the demonstration the audience can experience the advantages of the proposed approach on real Flash hardware.
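The abstract gives no implementation details, but the core idea of coupling logical DBMS structures to physical Flash regions can be sketched roughly as follows. All class names and the one-region-per-structure policy here are illustrative assumptions, not the authors' actual design:

```python
# Illustrative sketch: map DBMS logical structures to Flash "regions" so
# that pages with similar lifetime/update behavior are co-located. Hot
# index pages then never share an erase unit with cold table data, so an
# erase invalidates mostly-dead pages instead of forcing live-page
# migration, which lowers write amplification.

class Region:
    def __init__(self, name):
        self.name = name
        self.pages = []          # physical pages placed in this region

    def append(self, page_id):
        self.pages.append(page_id)

class FlashPlacement:
    """Groups pages by their owning logical structure (table, index, log)."""
    def __init__(self):
        self.regions = {}

    def place(self, logical_structure, page_id):
        # One region per logical structure; created lazily on first use.
        region = self.regions.setdefault(logical_structure,
                                         Region(logical_structure))
        region.append(page_id)
        return region.name

placement = FlashPlacement()
placement.place("orders_table", 1)
placement.place("orders_pk_index", 2)
placement.place("orders_table", 3)
assert placement.regions["orders_table"].pages == [1, 3]
```

Because the grouping key is a structure the DBMS already knows, no extra DBA-visible configuration is needed, which matches the "no extra overhead" claim in the abstract.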
Integrated power semiconductors are often used for applications with cyclic on-chip power dissipation. This leads to repetitive self-heating and thermo-mechanical stress, causing fatigue of the on-chip metallization and possibly destruction by short circuits. Because of this, an accurate simulation of the thermo-mechanical stress is needed already during the design phase to ensure that lifetime requirements are met. However, a detailed thermo-mechanical simulation of the device, including the on-chip metallization, is prohibitively time-consuming due to its complex structure, typically consisting of many thin metal lines with thousands of vias. This paper introduces a two-step approach as a solution to this problem. First, a simplified but fast simulation is performed to identify the device parts with the highest stress. Afterwards, precise simulations are carried out only for those parts. The applicability of this method is verified experimentally for LDMOS transistors with different metal configurations. The measured lifetimes and failure locations correlate well with the simulations. Moreover, a strong influence of the on-chip metallization layout on lifetime was observed. This could also be explained with the simulation method.
In vitro composed vascularized adipose tissue is and will continue to be in great demand, e.g. for the treatment of extensive high-grade burns or the replacement of tissue after tumor removal. To date, the lack of adequate culture conditions, mainly of a suitable culture medium, decelerates further achievements. In our study, we evaluated the influence of epidermal growth factor (EGF) and hydrocortisone (HC), often supplemented in endothelial cell (EC) specific media, on the co-culture of adipogenic differentiated adipose-derived stem cells (ASCs) and microvascular endothelial cells (mvECs). In ASCs, EGF and HC are thought to inhibit adipogenic differentiation and to have lipolytic activities. Our results showed that in indirect co-culture for 14 days, adipogenic differentiated ASCs further incorporated lipids and partly gained a univacuolar morphology when kept in media with low levels of EGF and HC. In media with high EGF and HC levels, cells did not incorporate further lipids; on the contrary, cells without lipid droplets appeared. Glycerol release, measured to quantify lipolysis, also increased with elevated amounts of EGF and HC in the culture medium. Adipogenic differentiated ASCs were able to release leptin in all setups. MvECs were functional and expressed the cell-specific markers CD31 and von Willebrand factor (vWF) independent of the EGF and HC content, as long as further EC-specific factors were present. Taken together, our study demonstrates that adipogenic differentiated ASCs can be successfully co-cultured with mvECs in a culture medium containing low or no amounts of EGF and HC, as long as further endothelial cell and adipocyte specific factors are available.
It is shown how, in the case of remote power feeding, the prediction of heating can be improved with appropriate modeling, and how the material and shape of the cable duct influence the heating and the temperature profile of the bundle. It is also shown that the increased heating in metal cable ducts is due to their lower emissivity, and how this can be improved.
This paper presents an approach for label-free brain tumor tissue typing. For this application, our dual modality microspectroscopy system combines inelastic Raman scattering spectroscopy and Mie elastic light scattering spectroscopy. The system enables marker-free biomedical diagnostics and records both the chemical and morphologic changes of tissues on a cellular and subcellular level. The system setup is described and the suitability for measuring morphologic features is investigated.
Electronic word-of-mouth (eWoM) communication plays an increasingly important role in modern business. The underlying concept of word-of-mouth (WoM) communication is well researched and has proved highly significant with respect to its impact on customers' purchase behavior. However, due to the advent of digital technologies, decision-making among customers is progressively shifting to the online world. Consequently, eWoM has received a lot of attention from the academic community. As multiple research papers focus on specific facets of eWoM, there is a need to integrate current research results systematically. Thus, this paper presents a scientific literature analysis in order to determine the current state of the art in the field of eWoM. Five main research areas were analyzed, supporting the need for further eWoM studies and providing a structured overview of existing results.
We present a topology of MIMO arrays of inductive antennas exhibiting inherent high crosstalk cancellation capabilities. A single-layer PCB is etched into a 3-channel array of emitting/receiving antennas. Once coupled with another similar 3-channel emitter/receiver, we measured an Adjacent Channel Rejection Ratio (ACRR) as high as 70 dB from 150 Hz to 150 kHz. Another primitive device, made of copper wires wound around PVC tubes to form a 2-channel “non-contact slip ring”, exhibited 22 dB to 47 dB of ACRR up to 15 MHz. In this paper we introduce the underlying theoretical model behind the crosstalk suppression capabilities of these so-called “Pie-Chart antennas”: an extension of the mutual inductance compensation method to a higher number of channels using symmetries. We detail the simple iterative building process of these antennas, illustrate it with numerical analysis, and evaluate their effectiveness via real experiments on the 3-channel PCB array and the 2-channel rotary array up to the limit of our test setup. The Pie-Chart design is primarily, but not exclusively, intended as an alternative to costly electronic filters or cumbersome EM shields in both wireless and wired applications.
Nowadays, CHP units are discussed as producers of electricity on demand rather than as generators of heat providing electricity as a by-product. By this means, CHP units are capable of satisfying a higher share of the electricity demand on-site, and in this new role they are able to reduce the load on the power grid and to compensate for the high fluctuations of solar and wind power.
Evidently, a novel control strategy for CHP units is required in order to shift from an operation oriented toward the heat demand to an operation led by the electricity demand. Nevertheless, the heat generated by the CHP unit needs to be utilized completely in any case, to maintain energetic as well as economic efficiency. Such a strategy has been developed at Reutlingen University and is presented in this paper. Part of the strategy is an intelligent management of the thermal energy storage (TES), ensuring that the storage is at a low level in terms of its heat content just before an electricity demand calls the CHP unit into operation. Moreover, a proper forecast of both heat and electricity demand is incorporated, and the requirements of the CHP unit in terms of maintenance and lifetime are considered by limiting the number of starts and stops per unit time and by maintaining a certain minimum length of the operation intervals.
All aspects of this novel control strategy, which has been implemented on a controller for further testing at two sites in the field, are presented in the paper. Results from these field tests are given, as well as results from a simulation model that is able to evaluate the performance of the control strategy over an entire year.
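The controller itself is not reproduced in the abstract, but the dispatch logic it describes (start only when electricity is demanded and the TES can absorb the heat, respect a minimum runtime, and cap the number of starts) can be sketched as a simple decision function. All parameter names and thresholds are illustrative assumptions, not the Reutlingen implementation:

```python
# Illustrative sketch of electricity-led CHP dispatch: the unit is only
# started when the thermal storage (TES) has headroom for the heat it
# will generate, and wear is limited by a minimum runtime and a daily
# start budget.

def chp_should_run(elec_demand_kw, chp_elec_kw, tes_level_kwh,
                   tes_capacity_kwh, heat_per_interval_kwh,
                   running, runtime_intervals, min_runtime_intervals,
                   starts_today, max_starts_per_day):
    tes_headroom = tes_capacity_kwh - tes_level_kwh
    if running:
        # Keep running until the minimum runtime is reached, then
        # continue only while the heat can still be stored or used.
        if runtime_intervals < min_runtime_intervals:
            return True
        return tes_headroom >= heat_per_interval_kwh and elec_demand_kw > 0
    # Start only if electricity is actually needed, the TES can take the
    # generated heat, and the daily start budget is not exhausted.
    return (elec_demand_kw >= chp_elec_kw
            and tes_headroom >= heat_per_interval_kwh
            and starts_today < max_starts_per_day)
```

In a real controller the demand inputs would come from the forecast mentioned in the abstract rather than from instantaneous measurements.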
Energy transfer kinetics in photosynthesis as an inspiration for improving organic solar cells
(2017)
Clues to designing highly efficient organic solar cells may lie in understanding the architecture of light-harvesting systems and exciton energy transfer (EET) processes in very efficient photosynthetic organisms. Here, we compare the kinetics of excitation energy tunnelling from the intact phycobilisome (PBS) light-harvesting antenna system to the reaction center in photosystem II in intact cells of the cyanobacterium Acaryochloris marina with the charge transfer after conversion of photons into photocurrent in vertically aligned carbon nanotube (va-CNT) organic solar cells with poly(3-hexyl)thiophene (P3HT) as the pigment. We find the kinetics of electron–hole creation following excitation at 600 nm in both PBS and va-CNT solar cells to be 450 and 500 fs, respectively. The EET process has a 3 ps and a 14 ps pathway in the PBS, while in va-CNT solar cell devices the charge trapping in the CNT takes 11 and 258 ps. We show that the main hindrance to the efficiency of va-CNT organic solar cells is the slow migration of the charges after exciton formation.
Development of a non-yellowing, fiber-based bra using innovative FIM technology
(2017)
The research project investigated the possibilities and limits of using sol-gel finishes to improve the rub and abrasion resistance of woven fabrics made from different fiber materials. The focus was on textiles for clothing and workwear as well as on upholstery fabrics (furniture, automotive, passenger transport).
A heavily researched area of computer vision is facial feature detection, i.e., locating salient points of the face such as the corners of the mouth or the chin. A large number of published methods can therefore be found, which, however, differ considerably in detection accuracy, robustness, and speed. Many methods are only conditionally real-time capable, or deliver satisfactory results only with high-resolution image sources. In recent years, methods have therefore been developed that attempt to solve these problems. This work examines three such state-of-the-art methods and their implementations: Constrained Local Neural Fields (CLNF), Discriminative Response Map Fitting (DRMF), and Structured Output SVM (SO-SVM). They are compared empirically with respect to detection accuracy.
Pokémon Go was the first mobile Augmented Reality (AR) game to reach the top of the download charts of mobile applications. However, very little is known about this new generation of mobile online AR games. Existing media usage and technology acceptance theories provide limited applicability to the understanding of its users. Against this background, this research provides a comprehensive framework that incorporates findings from uses and gratifications theory (U&G), technology acceptance and risk research, as well as flow theory. The proposed framework aims at explaining the drivers of attitudinal and intentional reactions, such as continuance in gaming or willingness to conduct in-app purchases. A survey among 642 Pokémon Go players provides insights into the psychological drivers of mobile AR games. Results show that hedonic, emotional and social benefits as well as social norms drive consumer reactions, whereas physical risks (but not privacy risks) hinder them. However, the importance of these drivers differs between different forms of user behavior.
High-quality decorative laminate panels typically consist of two major types of components: the surface layers, comprising décor and overlay papers that are impregnated with melamine-based resins, and the core, which is made of stacks of kraft papers impregnated with phenolic (PF) resin. The PF-impregnated layers impart superior hydrolytic stability, mechanical strength and fire resistance to the composite. The manufacturing involves a complex interplay between resin, paper and impregnation/drying processes. Changes in the input variables cause significant alterations in the process characteristics, and adaptations of the materials used and of the specific process conditions may, in turn, be required. This review summarizes the main variables influencing both the processability and the technological properties of phenolic resin impregnated papers and of laminates produced therefrom. It presents the main influences of the involved components (resin and paper), how these may be controlled during the respective process steps (resin preparation and paper production), how they influence the impregnation and lamination conditions, how they affect specific aspects of paper and laminate performance, and how they interact with each other (synergies).
Purpose: The purpose of this paper is to describe and discuss the current state of fashion business academic education worldwide. This is motivated by the wish to develop recommendations for the fashion business bachelor program of Reutlingen University.
Design/methodology/approach: This paper is based on a systematic review of relevant fashion business academic programs. A qualitative comparison is conducted through a categorization of the programs’ content and a score system evaluating the programs’ concepts.
Findings: Key findings were that several factors ensure successful fashion business education: Industry connections, international networks, project-based work, personalized career services and innovative approaches in teaching that include all steps along the fashion value chain.
Research limitations/implications: The research was primarily limited due to the limited number of schools assessed. As a result of the restricted time frame, those schools that were presented could only be analyzed regarding a few aspects. Future research should focus on a more in-depth analysis and further-reaching comparisons, e.g. comparisons with teaching concepts outside the fashion business area or with requirements by fashion companies.
First International Workshop on Hybrid dEveLopmENt Approaches in Software Systems Development
(2017)
A software process is the game plan for organizing project teams and running projects. Yet it is still a challenge to select the appropriate development approach for the respective context. A multitude of development approaches compete for the users' favor, but there is no silver bullet serving all possible setups. Moreover, recent research as well as experience from practice shows companies combining different development approaches to assemble the best-fitting approach for the respective company: a more traditional process provides the basic framework to serve the organization, while project teams embody this framework with more agile (and/or lean) practices to keep their flexibility. The first HELENA workshop aims to bring together the community to discuss recent findings and to steer future work.
The ability to develop and deploy high-quality software at high speed is of increasing relevance for the competitiveness of car manufacturers. Agile practices have shown benefits such as faster time to market in several application domains. Therefore, it seems promising to carefully adopt agile practices in the automotive domain as well. This article presents findings from an interview-based qualitative survey. It aims at understanding the perceived forces that support agile adoption. In particular, it focuses on embedded software development for electronic control units in the automotive domain.
Layout generators, commonly denoted as PCells (parameterized cells), play an important role in the layout design of analog ICs (integrated circuits). PCells can automatically create parts of a layout, whose properties are controlled by the PCell parameters. Any layout, whether hand-crafted or automatically generated, has to be verified against design rules using a DRC (design rule check) in order to assure proper functionality and producibility. Due to the growing complexity of today’s PCells it would be beneficial if a PCell itself could be ensured to produce DRC clean layouts for any allowed parameter values, i.e. a formal verification of the PCell’s code rather than checking all possible instances of the PCell. In this paper we demonstrate the feasibility of such a formal PCell verification for a simple NMOS transistor PCell. The set from which the parameter values can be chosen was found during the verification process.
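As a hypothetical illustration of the idea (not the paper's actual PCell or design rules): if a design rule can be expressed as an inequality over the PCell parameters, proving the inequality over the whole allowed parameter range verifies every possible instance at once, instead of instantiating and DRC-checking each one. All rule values and the cell geometry below are invented for the sketch:

```python
# Illustrative sketch: a toy parameterized cell emits layout rectangles
# from its parameters, and the relevant design rule reduces to a
# closed-form expression of those parameters. A formal check reasons
# about that expression instead of about individual instances.

MIN_POLY_SPACING = 0.18  # assumed design rule, in micrometers

def nmos_pcell(gate_length, fingers):
    """Generate poly-gate rectangles for a multi-finger NMOS (simplified)."""
    pitch = gate_length + MIN_POLY_SPACING + 0.02  # cell chooses its pitch
    return [(i * pitch, 0.0, i * pitch + gate_length, 1.0)
            for i in range(fingers)]

def poly_spacing(gate_length):
    # Spacing between adjacent fingers as a function of the parameter.
    return (gate_length + MIN_POLY_SPACING + 0.02) - gate_length

# "Formal" verification for this toy rule: the spacing expression is
# constant in gate_length, so the rule holds for every parameter value.
assert poly_spacing(0.13) >= MIN_POLY_SPACING
assert poly_spacing(10.0) >= MIN_POLY_SPACING
```

Real PCell code is far more intricate, which is exactly why the paper argues for verifying the generator's code rather than enumerating its instances.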
Under update intensive workloads (TPC, LinkBench) small updates dominate the write behavior, e.g. 70% of all updates change less than 10 bytes across all TPC OLTP workloads. These are typically performed as in-place updates and result in random writes in page-granularity, causing major write-overhead on Flash storage, a write amplification of several hundred times and lower device longevity.
In this paper we propose an approach that transforms those small in-place updates into small update deltas that are appended to the original page. We utilize the commonly ignored fact that modern Flash memories (SLC, MLC, 3D NAND) can handle appends to already programmed physical pages by using various low-level techniques such as ISPP to avoid expensive erases and page migrations. Furthermore, we extend the traditional NSM page-layout with a delta-record area that can absorb those small updates. We propose a scheme to control the write behavior as well as the space allocation and sizing of database pages.
The proposed approach has been implemented under Shore-MT and evaluated on real Flash hardware (OpenSSD) and a Flash emulator. Compared to In-Page Logging it performs up to 62% fewer reads and writes and up to 74% fewer erases on a range of workloads. The experimental evaluation indicates: (i) a significant reduction of erase operations, resulting in twice the longevity of Flash devices under update-intensive workloads; (ii) 15%-60% lower read/write I/O latencies; (iii) up to 45% higher transactional throughput; (iv) a 2x to 3x reduction in overall write amplification.
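The page layout described above, an NSM page extended with a delta-record area that absorbs small updates, can be sketched in miniature as follows. This is an illustrative model only, not the Shore-MT implementation; record storage, delta encoding, and the consolidation policy are all simplified assumptions:

```python
# Illustrative sketch: a page with a delta-record area. Small field
# updates are appended as deltas instead of rewriting the page in place
# (on Flash, an append into an already-programmed page avoids an erase);
# a reader applies the deltas on top of the base records.

class DeltaPage:
    def __init__(self, delta_capacity=16):
        self.records = {}        # rid -> dict of fields (base version)
        self.deltas = []         # appended (rid, field, new_value) triples
        self.delta_capacity = delta_capacity

    def insert(self, rid, record):
        self.records[rid] = dict(record)

    def update(self, rid, field, value):
        if len(self.deltas) < self.delta_capacity:
            # Small update: append a delta record.
            self.deltas.append((rid, field, value))
        else:
            # Delta area full: fold all deltas into the base records
            # (one page rewrite), then start a fresh delta area.
            for r, f, v in self.deltas:
                self.records[r][f] = v
            self.deltas.clear()
            self.records[rid][field] = value

    def read(self, rid):
        rec = dict(self.records[rid])
        for r, f, v in self.deltas:
            if r == rid:
                rec[f] = v
        return rec
```

Sizing the delta area, which the abstract identifies as part of the proposed scheme, would here correspond to choosing `delta_capacity` per page based on the workload's update pattern.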