Customer orientation should be the core engine of every organisation. Information technology can be considered the enabler for generating competitive advantages through customer processes in marketing, sales and service. The impact of information technology is both the biggest risk and a huge opportunity for any organisation. Research shows that Customer Relationship Management (CRM) enables organisations to perform better and focus more on their customers (e.g. the market capitalisation of Amazon). While global enterprises are shaping the future of customer centricity and information technology, the question arises of how German B2B organisations can shift their value contribution from product-centric to customer-centric. These organisations are therefore attempting to implement CRM software and to put their customers more into focus. However, the question remains how organisations approach the implementation of CRM and whether these attempts pay off in terms of business performance.
Contributing to this highly topical discussion, this thesis adds to the body of knowledge about the implementation of CRM in the German B2B sector and its impact on business performance. First, theoretical frameworks were developed based on an extensive literature review. In this step, different aspects of CRM are worked out and mapped against three dimensions of business performance, namely process efficiency, customer satisfaction and financial performance. Based on the theory, a conceptual framework was developed to test the relationships between CRM and Business Performance (BP). To this end, a survey with 500 participants was conducted. On this basis, a measurement model was developed to test five main hypotheses.
The findings suggest that the implementation of CRM positively impacts business performance. Specifically, the usage of analytical CRM and the establishment of a dedicated CRM success measurement correlate with the performance of German B2B organisations. In addition to these main findings, various key statements could be derived from the research, and a measurement model was developed that can be used to assess BP across different organisational characteristics. As a result, CRM implementations can be enhanced and business performance improved.
The ›AM Field Guide‹ provides a first structured overview of the complex and multifaceted world of additive manufacturing processes. Separated into the material groups polymers, metals and other materials, the most common AM processes offered on the market are presented schematically, and the core of each process is described in brief. In addition to the main processes presented here, there are many derivatives and special processes that are also used but are not explicitly shown. It should be noted that in the young field of additive manufacturing, many manufacturers give their AM applications their own names, so that a universally valid, comprehensive classification can only be approximated.
IT governance: current state of and future perspectives on the concept of agility in IT governance
(2020)
Digital transformation has changed corporate reality and, with that, corporates’ IT environments and IT governance (ITG). As such, the perspective of ITG has shifted from the design of a relatively stable, closed and controllable system of a self-sufficient enterprise to a relatively fluid, open, agile and transformational system of networked co-adaptive entities. Related to the paradigm shift in ITG, this thesis aims to conceptualize a framework to integrate the concept of agility into the traditional ITG framework and to test the effects of such an extended ITG framework on corporate performance.
To do so, the thesis uses literature research and a mixed method design by blending both qualitative and quantitative research methods. Given the poorly understood situation of the agile mechanisms within the ITG framework, the building process of this thesis’ research model requires an adaptive and flexible approach which involves four different research phases. The initial a priori research model based on a comprehensive review of the extant literature is critically examined and refined at the end of each research phase, which later forms the basis of a subsequent research phase. As a result, the final research model provides guidance on how the conceptualized framework leads to better business/IT alignment as well as how business/IT alignment can mediate the effectiveness of such an extended ITG framework on corporate performance.
The first research phase explores the current state of literature with a focus on the ITG-corporate performance association. This analysis identifies five perspectives with respect to the relationship between ITG and corporate performance. The main variables lead to the perspectives of business/IT alignment, IT leadership, IT capability and process performance, resource relatedness and culture. Furthermore, the analysis presents core aspects explored within the identified perspectives that could act as potential mediators or moderators in the relationship between ITG and corporate performance.
The second research phase investigates the agile aspect of an effective ITG framework in the dynamic contemporary world through a qualitative study. Gleaned from 46 semi-structured interviews with governance experts across various industries, the study identifies 25 agile ITG mechanisms and 22 traditional ITG mechanisms that corporations use to master digital transformation projects. Moreover, the research reveals two key patterns pointing to a call for ambidextrous ITG, with corporations alternating between stability and agility in their ITG mechanisms.
In research phase three, a scale development process is conducted to operationalize the agile mechanisms explored in research phase two. Through 56 qualitative interviews with professionals, the evaluation uncovers 46 agile governance mechanisms. These are then rated by 29 experts to identify the most effective ones, leading to the identification of six structural elements, eight processes and eight relational mechanisms.
Finally, in research phase four, a quantitative study based on a survey of 400 respondents is conducted to test and predict the formulated relationships using the partial least squares structural equation modelling (PLS-SEM) method. The results provide evidence for a strong causal relationship among an expanded ITG concept, business/IT alignment and corporate performance. These findings reveal that the agile ITG mechanisms within an effective ITG framework seem critical in today’s digital age.
This research is unique in exploring the combination of traditional and agile ITG mechanisms. It contributes to the theoretical base by integrating and extending the literature on ITG, business/IT alignment, ambidexterity and agility, all of which have long been recognized as critical for achieving organizational goals. In summary, this work presents an original analysis of an effective ITG framework for digital transformation by including the agile aspect within the ITG construct. It highlights that it is not enough to apply only traditional mechanisms to achieve effective business/IT alignment in today’s digital age; agile ITG mechanisms are also needed. Therefore, a novel ITG framework following an ambidextrous approach is provided, consisting of traditional ITG mechanisms as well as newly developed agile ITG practices. This thesis also demonstrates that agile ITG mechanisms can be measured independently of traditional ITG mechanisms within one causal model. This is an important theoretical outcome that allows the current state of ITG to be assessed in two distinct dimensions, offering various pathways for further research on the different antecedents and effects of traditional and agile ITG mechanisms. Furthermore, this thesis makes practical contributions by highlighting the need to develop a basic governance framework powered by traditional ITG mechanisms and simultaneously to increase agility in ITG mechanisms. The results imply that corporations might be even more successful if they include both traditional and agile mechanisms in their ITG framework. In this way, the uncovered agile ITG practices may provide a template for CIOs to derive their own mechanisms, following an ambidextrous approach that is suitable for their corporation.
The development of new materials that mimic cartilage and its function is an unmet need that would allow replacing the damaged parts of joints instead of the whole joint. Polyvinyl alcohol (PVA) hydrogels have raised special interest for this application due to their biocompatibility, high swelling capacity and chemical stability. In this work, the effect of post-processing treatments (annealing, high hydrostatic pressure (HHP) and gamma-radiation) on the performance of PVA gels obtained by cast-drying was investigated, and their ability to be used as delivery vehicles for the anti-inflammatories diclofenac and ketorolac was evaluated. HHP damaged the hydrogels, breaking some bonds in the polymeric matrix, and therefore led to poor mechanical and tribological properties. The remaining treatments, in general, improved the performance of the materials by increasing their crystallinity. Annealing at 150 °C produced the best mechanical and tribological results: higher resistance to compressive and tensile loads, lower friction coefficients and the ability to support higher loads in sliding movement. This material was loaded with the anti-inflammatories, both without and with vitamin E (Vit.E) or Vit.E + cetalkonium chloride (CKC). Vit.E + CKC helped to control the release of the drugs, which occurred over 24 h. The material did not induce irritability or cytotoxicity and therefore shows high potential for use in cartilage replacement with a therapeutic effect in the immediate postoperative period.
Human bestrophin-1 (hBest1) is a transmembrane Ca2+-dependent anion channel, associated with the transport of Cl−, HCO3− ions, γ-aminobutyric acid (GABA) and glutamate (Glu), and with the regulation of retinal homeostasis. Its mutant forms cause retinal degenerative diseases, defined as bestrophinopathies. Using both physicochemical approaches (surface pressure/mean molecular area (π/A) isotherms, hysteresis, compressibility moduli of hBest1/sphingomyelin (SM) monolayers, Brewster angle microscopy (BAM) studies) and biological approaches (detergent membrane fractionation, Laurdan (6-dodecanoyl-N,N-dimethyl-2-naphthylamine) and immunofluorescence staining of stably transfected MDCK-hBest1 and MDCK II cells), we report:
1) Ca2+, Glu and GABA interact with binary hBest1/SM monolayers at 35 °C, resulting in changes in hBest1 surface conformation, structure, self-organization and surface dynamics. The process of mixing in hBest1/SM monolayers is spontaneous, and the effect of the protein on the binary films was defined as “fluidizing”, hindering the phase transition of the monolayer from the liquid-expanded to the intermediate (LE-M) state;
2) in stably transfected MDCK-hBest1 cells, bestrophin-1 was distributed between detergent-resistant (DRM) and detergent-soluble membranes (DSM) at up to 30 % and 70 %, respectively; in live cells, hBest1 was visualized in both liquid-ordered (Lo) and liquid-disordered (Ld) fractions, with protein association quantified at up to 35 % with Lo and 65 % with Ld. Our results indicate that the spontaneous miscibility of hBest1 and SM is a prerequisite for diverse protein interactions with membrane domains, different structural conformations and biological functions.
Projektmanagement
(2020)
Project management is a tool for handling singular tasks in a structured, interdisciplinary and cross-company manner: tasks that are unique and extremely important for the company and cannot simply be processed within the existing line organisation. Project management refers both to a concept for leading a complex undertaking and to the institution that leads this undertaking.
Purpose. To improve the efficiency of the closed-loop operation of the field-oriented induction machine in dynamic behavior when load conditions are changing, considering the nonlinearities of the main inductance.
Methodology. The optimal control problem is defined as the minimization of the time integral of the energy losses. The algorithm described in this paper uses Matlab/Simulink, the dSPACE real-time interface and the C language. Real-time applications are handled in the ControlDesk experiment software for seamless ECU development.
Findings. A discrete-time model with an integrated predictive control scheme, in which the optimization is performed online at every sampling step, has been developed. The optimal field-producing current trajectory is determined so that the copper losses are minimized over a wide operational range. Additionally, a comparison of measurement results with conventional methods is provided, which validates the advantages and performance of the control scheme.
Originality. To solve the given problem, the information vector on the current state of the coordinates of the electromechanical system is used to form a controlling influence in the dynamic mode of operation. For the first time, the formation of the controls considers both the current state and the desired future state of the system in the real-time domain.
Practical value. A predictive iterative approach to the optimal flux level of an induction machine is important to generate the required electromagnetic torque and to reduce power losses simultaneously.
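The core optimization idea can be illustrated with a deliberately simplified machine model. All values and the model itself (torque proportional to the product of the two current components, copper losses proportional to the sum of their squares) are illustrative assumptions, not the paper's model, which handles main-inductance nonlinearity and dynamics.

```python
import math

# Toy model (invented constants): torque T = k * i_d * i_q,
# copper losses P = R * (i_d**2 + i_q**2).
k, R, T_ref = 0.5, 0.1, 8.0

def losses(i_d):
    i_q = T_ref / (k * i_d)      # i_q needed to deliver the torque T_ref
    return R * (i_d**2 + i_q**2)

# Online-style search over candidate flux-producing currents.
candidates = [0.5 + 0.01 * n for n in range(1000)]
i_d_opt = min(candidates, key=losses)

# For this toy model the analytic optimum is i_d = sqrt(T/k),
# i.e. equal current components.
print(i_d_opt, math.sqrt(T_ref / k))
```

The search confirms the closed-form optimum of the toy model; the paper's method performs a comparable loss-minimizing trajectory search at every sampling step, under a far richer model.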
The planning and control of intralogistics systems in line with the versatile production systems of smart factories requires new approaches and methods to cope with changing requirements within future factories. The planning of intralogistics can no longer follow a static, sequential approach as in the past, since the planning assumptions change at a high frequency. Reasons for these constant changes are, among others, external turbulences such as rapidly changing market conditions and decreasing batch sizes down to customer-specific products with a batch size of one, as well as internal turbulences (such as production and logistics resource breakdowns) affecting the production system. This paper gives an insight into research approaches and results on how the capabilities of intelligent logistical objects (intelligent bins, autonomous transport systems etc.) can be used to achieve a self-organized, cost- and performance-optimized intralogistics system with autonomously controlled process execution within versatile production environments. A first consistent method has been developed, validated and implemented within a scenario at the pilot factory Werk150 at the ESB Business School (Reutlingen University). Based on the incoming production orders, the method of the Extended Profitability Appraisal (EPA), covering the work system value to define the most effective work system for order fulfilment, is applied. To derive the appropriate intralogistics processes, an autonomous control method involving principles of decentralized and target-oriented decision-making (e.g. intelligent bins interacting with autonomously controlled transport systems to fulfil material orders of assembly workstations) has been developed and applied to achieve target-optimized process execution.
The results of the first research stage, using predefined material sources and sinks as described in this paper, set the basis for the further development of a self-organized and autonomously controlled method for intralogistics systems considering dynamic source and sink relations. By allowing dynamic shifts of production orders in the sense of dynamic source and sink relations, the cost and performance aims of the intralogistics system can be directly aligned with the aims of the entire versatile production system in the sense of self-organized and autonomously controlled systems.
The approach of self-organized and autonomously controlled systems offers great potential to meet new requirements for the economical production of customized products in small batch sizes, based on a distributed, flexible management of dynamics and complexity within the production and intralogistics system. To support the practical application of self-organization in intralogistics systems, a catalogue of criteria for evaluating the self-organization of flexible logistics systems has been developed and validated, which enables the classification of logistics systems as well as the identification and evaluation of the potentials that can be achieved by increasing the degree of self-organization.
Rapidly changing market conditions and global competition are leading to an increasing complexity of logistics systems and require innovative approaches with respect to the organisation and control of these systems. In scientific research, concepts of autonomously controlled logistics systems show a promising approach to meet the increasing requirements for flexible and efficient order processing. In this context, this work aims to introduce a system that is able to adjust order processing dynamically, and optimise intralogistics transportation regarding various generic intralogistics target criteria. The logistics system under consideration consists of various means of transport for autonomous decision-making and fulfilment of transport orders with defined source-sink relationships. The context of this work is set by introducing the Learning Factory Werk 150 with its existing hardware and software infrastructure and its defined target figures to measure the performance of the system. Specifically, the important target figures cost and performance are considered for the transportation system. The core idea of the system’s logic is to solve the problem of order allocation to specific means of transport by linking a Genetic Algorithm with a Multi-Agent System. The implementation of the developed system is described in an application scenario at the learning factory.
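The order-allocation core of such a system can be sketched with a small genetic algorithm. The order durations, vehicle count, GA parameters and the makespan objective below are all invented for illustration; the actual system links its GA to a multi-agent system and optimizes the learning factory's own cost and performance figures.

```python
import random

random.seed(42)

# Hypothetical transport orders (handling times) and three vehicles.
orders = [4, 2, 7, 3, 5, 6, 1, 8]
n_vehicles = 3

def cost(assign):
    # Makespan: the busiest vehicle limits system performance.
    loads = [0] * n_vehicles
    for duration, v in zip(orders, assign):
        loads[v] += duration
    return max(loads)

def evolve(pop_size=30, generations=60):
    pop = [[random.randrange(n_vehicles) for _ in orders]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        survivors = pop[: pop_size // 2]      # elitist selection
        children = []
        while len(survivors) + len(children) < pop_size:
            p1, p2 = random.sample(survivors, 2)
            cut = random.randrange(1, len(orders))
            child = p1[:cut] + p2[cut:]        # one-point crossover
            if random.random() < 0.2:          # mutation
                child[random.randrange(len(orders))] = random.randrange(n_vehicles)
            children.append(child)
        pop = survivors + children
    return min(pop, key=cost)

best = evolve()
print(cost(best))
```

With a total workload of 36 time units on three vehicles, the best achievable makespan here is 12; in the described system, each chromosome position would instead be resolved by agent negotiation over concrete transport resources.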
Learning factories can complement each other by training different competencies in the field of digitalisation and Industry 4.0. They depict diverse sections of the product development process and focus on various technologies. Within the framework of the International Association of Learning Factories (IALF), the operating organisations of learning factories exchange information on research, training and education. One of the aims is to develop joint projects. The article presents different concepts of cooperation between learning factories while focusing on improving the development of learners’ competencies, e.g. with a broader range of topics. A concept for a joint course between the learning factories in Bochum, Reutlingen and Darmstadt is explained in detail. The three learning factories are examined with regard to their similarities and differences. The joint course focuses on the target group of students and the topic of digitalisation in the development and production of products. The course and its contents are explained in detail. The new learning approach is evaluated on the basis of feedback from the participants. Finally, challenges resulting from the cooperation between learning factories at different locations and with different operating models are discussed.
This paper presents a novel emulation concept for testing smart contracts and Distributed Ledger Technologies (DLT) in distributed control or energy economy tasks and use cases. The concept uses state-of-the-art behavioral modeling tools such as Matlab Simulink but presents a possible way to overcome Simulink’s inability to communicate with DLT nodes directly. This is solved through a middleware solution. After this, an example used in verifying the test bed is presented and the target demonstration object is described. Finally, the possible expansion of the system is discussed.
This paper aims at presenting a solution that enables end customers of the energy system to participate in new local micro-energy markets by providing them with a distributed, decentralized, transparent and secure Peer-to-Peer (P2P) payment system, which functions automatically by applying new concepts of Machine-to-Machine (M2M) communication technologies. This work was performed within the German project VK_2G, funded by the DBU. The key results were: providing means to perform microtransactions in a P2P fashion between end consumers and prosumers in local communities at low cost in a transparent and secure manner; developing a platform with pre-defined smart contracts that can easily be tailored to different end customers’ needs; and integrating both the market platform and the local control of generation and loads. This solution has been developed, integrated and tested in a laboratory prototype. This paper discusses the solution and presents the results of the first test.
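The settlement logic behind such P2P microtransactions can be sketched as a minimal local-market ledger. The tariff, participant names and in-memory bookkeeping below are invented for illustration; the project's platform realizes this with smart contracts on a distributed ledger rather than a single Python object.

```python
from dataclasses import dataclass, field

TARIFF = 0.25  # assumed local price per kWh, EUR

@dataclass
class Ledger:
    """Toy stand-in for a smart-contract-based P2P settlement."""
    balances: dict = field(default_factory=dict)
    log: list = field(default_factory=list)

    def settle(self, seller, buyer, kwh):
        amount = round(kwh * TARIFF, 2)
        self.balances[seller] = self.balances.get(seller, 0.0) + amount
        self.balances[buyer] = self.balances.get(buyer, 0.0) - amount
        self.log.append((seller, buyer, kwh, amount))  # transparent history

ledger = Ledger()
ledger.settle("pv_house_1", "consumer_7", 3.2)  # hypothetical prosumer sale
ledger.settle("pv_house_1", "consumer_9", 1.0)
print(ledger.balances)
```

The append-only log mirrors the transparency property of the DLT-based system; the key difference is that a distributed ledger enforces it without a trusted central party.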
Facial expressions play a dominant role in facilitating social interactions. We endeavor to develop tactile displays to reinstate facial-expression-modulated communication. The high spatial and temporal dimensionality of facial movements poses a unique challenge when designing tactile encodings of them. A further challenge is developing encodings that are attuned to the perceptual characteristics of our skin. A caveat of using vibrotactile displays is that tactile stimuli have been shown to induce perceptual tactile aftereffects when used on the fingers, arm and face. However, at present, despite the prevalence of waist-worn tactile displays, no such investigations of tactile aftereffects at the waist region exist in the literature, though they are warranted by the unique sensory and perceptual signalling characteristics of this area. Using an adaptation paradigm, we investigated the presence of perceptual tactile aftereffects induced by continuous and burst vibrotactile stimuli delivered at the navel, side and spinal regions of the waist. We report evidence that the tactile perception topology of the waist is non-uniform; specifically, the navel and spine regions are resistant to adaptive aftereffects, while side regions are more prone to perceptual adaptations to continuous but not burst stimulations. Results of our current investigations highlight the unique set of challenges posed by designing waist-worn tactile displays. These and future perceptual studies can directly inform more realistic and effective implementations of complex high-dimensional spatiotemporal social cues.
Melamine-formaldehyde (MF) resins are widely used for decorative paper impregnation. Resin properties relevant for impregnation are mainly determined already at the stage of resin synthesis by the applied reaction conditions. Thus, understanding the relationship between reaction conditions and technological properties is important. Response surface methodology based on orthogonal parameter level variations is the most suitable tool to identify and quantify factor effects and deduce causal correlation patterns. Here, two major process factors of MF resin synthesis were systematically varied using such a statistical experimental design. To arrive at resins having a broad range of technological properties, initial pH and M:F ratio were varied in a wide range (pH: 7.9–12.1; M:F ratio: 1:1.5–1:4.5). The impregnation behavior of the resins was modeled using viscosity, penetration rate and residual curing capacity as technological responses. Based on the response surface models, nonlinear and synergistic action of process factors was quantified and a suitable process window for preparing resins with favorable impregnation performance was defined. It was found that low M:F ratios (~1:2–1:2.5) and comparatively high starting pHs (~pH 11) yield impregnation resins with rapid impregnation behavior and good residual curing capacity.
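The response-surface modelling step can be sketched as an ordinary-least-squares fit of a two-factor quadratic model. All design points and response values below are invented for illustration; the study's actual factor levels, responses and model diagnostics differ.

```python
# Fit z = b0 + b1*x + b2*y + b3*x^2 + b4*y^2 + b5*x*y via normal equations.
def fit_quadratic(xs, ys, z):
    rows = [[1, x, y, x * x, y * y, x * y] for x, y in zip(xs, ys)]
    n = 6
    A = [[sum(r[i] * r[j] for r in rows) for j in range(n)] for i in range(n)]
    c = [sum(r[i] * zi for r, zi in zip(rows, z)) for i in range(n)]
    # Gaussian elimination with partial pivoting.
    for i in range(n):
        p = max(range(i, n), key=lambda k: abs(A[k][i]))
        A[i], A[p] = A[p], A[i]
        c[i], c[p] = c[p], c[i]
        for k in range(i + 1, n):
            f = A[k][i] / A[i][i]
            for j in range(i, n):
                A[k][j] -= f * A[i][j]
            c[k] -= f * c[i]
    b = [0.0] * n
    for i in reversed(range(n)):
        b[i] = (c[i] - sum(A[i][j] * b[j] for j in range(i + 1, n))) / A[i][i]
    return b

def predict(b, x, y):
    return b[0] + b[1]*x + b[2]*y + b[3]*x*x + b[4]*y*y + b[5]*x*y

# Invented design: initial pH and F-per-M share, with invented viscosities.
ph  = [8, 8, 10, 10, 12, 12, 10, 9, 11]
f_m = [1.5, 4.5, 1.5, 4.5, 1.5, 4.5, 3.0, 2.0, 4.0]
visc = [120, 95, 105, 80, 140, 110, 90, 100, 95]

coef = fit_quadratic(ph, f_m, visc)
print([round(v, 2) for v in coef])
```

Inspecting the fitted surface (e.g. where its stationary point lies, and the sign of the interaction term b5) is exactly how the study derives its process window for pH and M:F ratio.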
Deutschland, quo vadis?
(2020)
Shutdown in Germany in March 2020. Standstill in retail and industry. The market value of a considerable number of companies halved within a very short time. Investors threw everything onto the market. And given the high uncertainty, all asset classes lost value, at times even gold. Even corporations such as Lufthansa will no longer manage to survive without state aid.
Blockchain technology represents a promising approach for transparency and resilience in supply chains. This article examines which blockchain solutions are currently available for the supply chain and analyses the projects implemented in this area so far. Most of the realised projects relate to simple products and supply chain structures. The reason is that, until now, solutions for the holistic mapping of complex products in dynamic supply chain structures have been lacking. However, first promising approaches are now available.
In the GalvanoFlex_BW project, various options for improving the energy efficiency of energy-intensive industrial companies were identified and investigated. The introduction of combined heat and power (CHP) in electricity-optimised operation was a particular focus. In addition to the technical investigation, a social science analysis was carried out in order to consider the introduction and implementation of corresponding measures from this perspective as well. A further important focus of the project was the transfer of the acquired knowledge to other companies, institutions etc. that were not directly involved in the project.
Through the cooperation of four research partners and three industry partners, the work was carried out in a practice-oriented manner on the basis of real measurement data and through interviews with the people involved at the project partners, in the sense of a real-world laboratory.
The need to save energy, and thus to implement efficiency measures, is an important step towards achieving the greenhouse gas reduction planned by the EU. In addition, energy costs can be expected to continue to rise in the future. This makes measures to increase energy efficiency an important element in ensuring competitive production. Within the project, various energy efficiency measures specifically for electroplating were therefore researched, analysed and compiled into a catalogue of measures. Furthermore, an evaluation method was developed to support electroplating companies in identifying worthwhile energy efficiency measures.
In the investigations into the implementation of combined heat and power plants, the example of two companies with very different electricity and heat demand profiles showed that the use of such plants is economically worthwhile. In the most favourable case, payback periods of about two years result. It also became apparent that the sizing of the CHP unit depends strongly on the electricity and heat demand, and that the buffer storage should by no means be undersized. However, intelligent electricity-optimised control with peak load management can often improve the economic viability only slightly compared with heat-led operation. The reason is that the current subsidy and remuneration system for CHP plants offers almost no incentives for operation oriented towards electricity demand and thus towards covering residual load. Only for companies with pronounced peaks in electricity consumption can the targeted use of a CHP unit lead to a significant reduction in the demand charge.
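A back-of-the-envelope payback estimate illustrates the kind of calculation behind the reported figures. Every number below (plant size, efficiencies, prices, investment) is an invented assumption, not project data; only the resulting order of magnitude (around two years in a favourable case) echoes the project's finding.

```python
# Static CHP payback estimate with invented figures.
p_el = 100                        # kW electrical output (assumed)
el_eff, th_eff = 0.38, 0.45       # electrical / thermal efficiency (assumed)
boiler_eff = 0.90                 # efficiency of the boiler replaced (assumed)
hours = 6_000                     # full-load hours per year (assumed)
el_price, gas_price = 0.20, 0.05  # EUR/kWh (assumed)
invest_eur = 180_000              # investment incl. installation (assumed)

fuel_kwh = p_el / el_eff * hours                  # annual fuel input
el_savings = p_el * hours * el_price              # avoided grid electricity
fuel_cost = fuel_kwh * gas_price
heat_credit = fuel_kwh * th_eff * gas_price / boiler_eff  # avoided boiler fuel
annual_savings = el_savings - fuel_cost + heat_credit
payback_years = invest_eur / annual_savings
print(round(payback_years, 2))
```

A real appraisal would additionally model the load profiles hour by hour, which is exactly why the project found the sizing of the CHP unit and buffer storage to be so sensitive to the demand data.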
Furthermore, the introduction of a CHP plant must generally be regarded as a complex energy efficiency measure which, beyond the economic aspects, places increased demands on the companies and their professional environment; these demands were examined as part of the accompanying social science research. Drivers as well as barriers to implementing the technology were identified both inside and outside the companies. The internal barriers can be traced back to different causes, such as the high complexity of CHP technology, the difficulty of assessing the overall benefit within the company, insufficient staffing and a lack of entrepreneurial decisions. Improved consulting services, particularly from neutral bodies, as well as plant operation under contracting models, can provide a remedy here and thus act as drivers.
The transfer of the results obtained in the project already took place during the project period via the industry platform, in the course of various workshops aimed specifically at companies and institutions outside the project consortium. To amplify the dissemination of the acquired knowledge, a series of four articles was published in a well-known industry magazine at the end of the project, and a website for the project was created (www.galvanoflex_bw.de). The latter not only serves to disseminate knowledge, but is also intended to act as a contact platform beyond the end of the project, supporting the implementation of efficiency measures with the knowledge generated in the project.
Automatic anode rod inspection in aluminum smelters using deep-learning techniques: a case study
(2020)
Automatic fault detection using machine learning has become an exciting and promising area of research, because it provides an accurate and timely way to manage and classify faults with minimal human effort. In the computer vision community, deep-learning methods have become the most suitable approaches for this task. Anodes are large carbon blocks that are used to conduct electricity during the aluminum reduction process. The most basic function of anode rod inspection is to prevent a situation where the anode rod will not fit into the stub-holes of a new anode. This would be the case for a rod containing either severe toe-in, missing stubs, or a retained thimble on one or more stubs. In this work, to improve the accuracy of shape defect inspection for anode rods, we use the Fast Region-based Convolutional Network (Fast R-CNN) model. To train the detection model, we collected an image dataset composed of multiple classes of anode rod defects with annotated labels. Our model is trained using a small number of samples, an essential requirement in industry, where the number of available defective samples is limited. It can simultaneously detect multiple classes of anode rod defects in near real-time.
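Region-based detectors like Fast R-CNN emit many overlapping scored boxes per object, which are reduced by non-maximum suppression. The sketch below is a generic illustration of that post-processing step, not the case study's implementation; the example boxes and scores are invented.

```python
# Intersection-over-union and greedy non-maximum suppression,
# as used in R-CNN-family detection pipelines.
def iou(a, b):
    # Boxes as (x1, y1, x2, y2).
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    return inter / (area(a) + area(b) - inter)

def nms(boxes, scores, thresh=0.5):
    order = sorted(range(len(boxes)), key=lambda i: -scores[i])
    keep = []
    for i in order:
        # Keep a box only if it does not overlap a better-scored kept box.
        if all(iou(boxes[i], boxes[j]) < thresh for j in keep):
            keep.append(i)
    return keep

# Two overlapping detections of the same (hypothetical) missing-stub
# defect plus one distinct detection elsewhere on the rod.
boxes = [(10, 10, 50, 50), (12, 12, 52, 52), (100, 100, 140, 140)]
scores = [0.9, 0.8, 0.7]
print(nms(boxes, scores))
```

The second box is suppressed because its overlap with the higher-scored first box exceeds the threshold, leaving one detection per defect.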
Enhancing data-driven algorithms for human pose estimation and action recognition through simulation
(2020)
Recognizing human actions, reliably inferring their meaning and being able to potentially exchange mutual social information are core challenges for autonomous systems when they directly share the same space with humans. Intelligent transport systems in particular face this challenge, as interactions with people are often required. The development and testing of technical perception solutions is done mostly on standard vision benchmark datasets, for which manual labelling of sensory ground truth has been a tedious but necessary task. Furthermore, rarely occurring human activities are underrepresented in these datasets, leading to algorithms not recognizing such activities. For this purpose, we introduce a modular simulation framework, which makes it possible to train and validate algorithms on various human-centred scenarios. We describe the usage of simulation data to train a state-of-the-art human pose estimation algorithm to recognize unusual human activities in urban areas. Since the recognition of human actions can be an important component of intelligent transport systems, we investigated how simulations can be applied for this purpose. Laboratory experiments show that we can train a recurrent neural network with only simulated data based on motion capture data and 3D avatars, which achieves almost perfect performance in the classification of those human actions on real data.
The design process for a single-phase, smart, universal charger for light electric vehicles is presented. It consists of a step-up power factor correction circuit followed by a phase-shifted full-bridge converter with synchronous rectification on the secondary side. Due to the resistor-capacitor-diode snubber on the secondary side, the current peak at the start of power transfer leads to false triggering during light-load operation with peak current mode control. The solution developed for light loads is to change from peak current control to voltage control; this is achieved by limiting the maximum phase shift instead of changing the reference value. For the power factor correction stage, measured and calculated efficiencies are compared as a function of the output power. The voltage and current waveforms are shown for the power factor correction circuit, and for the phase-shifted bridge the measured current waveform is compared with simulation.
Sol-gel based flame retardants are a promising approach for textiles, particularly as replacements for the currently established halogenated flame retardants. The latter have come under criticism due to toxicological concerns and, in some cases, bioaccumulative properties. This research project therefore investigated how halogen-free flame retardants can be realized from phosphorus- and nitrogen-containing silanes. The sol-gel layer acted on the one hand as a non-flammable binder; on the other hand, flame-retardant groups could be incorporated directly by attaching phosphorus groups to commercially available silanes. Various synthesis approaches were pursued, and all of the N-P silanes produced achieved flame protection according to DIN EN ISO 15025 (protective clothing: protection against heat and flame). The flame-retardant effect depends strongly on the functional groups and the oxidation state of the phosphorus; adequate flame protection was achieved at add-on levels of 5%. It could be shown that a mechanism based on the formation of a protective layer is mainly responsible for the flame retardancy, a result that should not be underestimated for the future optimization of such finishes. Finishing trials on a semi-industrial scale further showed that, in principle, nothing stands in the way of a large-scale implementation of the applied finishes. Depending on the functional group on the phosphorus, the water solubility and wash fastness could be controlled: hydrophobic N-P silanes exhibit better wash resistance, while hydrophilic N-P silanes only attain it at fixation temperatures of 180 °C.
Building on these results, nitrogen-generating and cyanuric-acid-based N-P silanes were developed, which are distinguished in particular by a good flame-retardant effect on blended fabrics. Overall, the research project demonstrated that N-P silanes are excellent permanent flame retardants for textiles and clarified the mechanism on which this flame retardancy is based.
Due to decreased mobility or families living apart, older adults are especially vulnerable to the issue of social isolation. Literature suggests that technology can help to prevent this isolation. The present work addresses an approach to participate in society by sharing knowledge that is cherished. We propose the cooking recipe exchange application PrecRec for older adults to make them feel precious and valued. PrecRec has been developed and evaluated in an iterative process with eleven older adults. The results show that a broad perspective has to be taken into account when designing such systems.
Hypermedia as the Engine of Application State (HATEOAS) is one of the core constraints of REST. It refers to the concept of embedding hyperlinks into the response of a queried or manipulated resource to show a client possible follow-up actions and transitions to related resources. Thus, this concept aims to provide a client with a navigational support when interacting with a Web-based application. Although HATEOAS should be implemented by any Web-based API claiming to be RESTful, API providers tend to offer service descriptions in place of embedding hyperlinks into responses. Instead of relying on a navigational support, a client developer has to read the service description and has to identify resources and their URIs that are relevant for the interaction with the API. In this paper, we introduce an approach that aims to identify transitions between resources of a Web-based API by systematically analyzing the service description only. We devise an algorithm that automatically derives a URI Model from the service description and then analyzes the payload schemas to identify feasible values for the substitution of path parameters in URI Templates. We implement this approach as a proxy application, which injects hyperlinks representing transitions into the response payload of a queried or manipulated resource. The result is a HATEOAS-like navigational support through an API. Our first prototype operates on service descriptions in the OpenAPI format. We evaluate our approach using ten real-world APIs from different domains. Furthermore, we discuss the results as well as the observations captured in these tests.
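The core of the link-injection step described above, substituting path parameters in URI Templates with matching payload fields, can be sketched as follows. This is a minimal illustration under assumptions, not the authors' implementation: the payload fields, the template list, and the `_links` key are all hypothetical.

```python
import re

def inject_links(payload: dict, uri_templates: list) -> dict:
    """Substitute path parameters in URI templates with matching
    payload field values and attach the results as hyperlinks."""
    links = []
    for template in uri_templates:
        params = re.findall(r"\{(\w+)\}", template)
        # Only emit a link if every path parameter can be filled
        # from a payload field of the same name.
        if params and all(p in payload for p in params):
            uri = template
            for p in params:
                uri = uri.replace("{" + p + "}", str(payload[p]))
            links.append(uri)
    payload["_links"] = links
    return payload

# Hypothetical API: an order resource linking to its customer.
result = inject_links(
    {"orderId": 7, "customerId": 42},
    ["/orders/{orderId}", "/customers/{customerId}", "/invoices/{invoiceId}"],
)
# result["_links"] → ["/orders/7", "/customers/42"]
```

Templates whose parameters cannot be resolved from the payload (here `/invoices/{invoiceId}`) are simply skipped, mirroring the paper's idea of deriving only feasible transitions.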
With the continuous development of the economy, consumers pay more attention to personalized clothing. However, the recommendation quality of existing clothing recommendation systems is not sufficient to meet users' needs. When browsing clothing online, facial expressions are salient information for understanding the user's preference. In this paper, we propose a novel method for automatically personalized clothing recommendation based on analysis of user emotion. Firstly, the facial expression is classified by a multiclass SVM. Next, the user's multi-interest value is calculated using the expression intensity obtained by a hybrid RCNN. Finally, the multi-interest value is fused to carry out personalized recommendation. The experimental results show that the proposed method achieves a significant improvement over other algorithms.
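The abstract does not specify how the multi-interest value is fused, so the following is a purely hypothetical sketch of one plausible fusion: per-expression intensities weighted by an assumed signed valence and squashed into [0, 1]. All names and weights are illustrative assumptions, not the paper's method.

```python
def interest_score(intensities, valence):
    """Illustrative fusion: weight each expression's intensity by a
    signed valence (positive emotions raise interest, negative ones
    lower it) and clamp the result to [0, 1]. All weights here are
    assumptions for illustration only."""
    raw = sum(intensities[e] * valence[e] for e in intensities)
    return max(0.0, min(1.0, 0.5 + raw))

# Hypothetical intensities from an expression classifier.
score = interest_score(
    {"happy": 0.8, "neutral": 0.1, "disgust": 0.05},
    {"happy": 0.5, "neutral": 0.0, "disgust": -0.5},
)
# score → 0.875
```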
Deep learning-based fabric defect detection methods have been widely investigated to improve production efficiency and product quality. Although deep learning-based methods have proved to be powerful tools for classification and segmentation, some key issues remain to be addressed when they are applied to real applications. Firstly, the actual fabric production conditions of factories necessitate higher real-time performance. Moreover, fabric defects, as abnormal samples, are very rare compared with normal samples, which results in data imbalance and makes model training based on deep learning challenging. To solve these problems, an extremely efficient convolutional neural network, Mobile-Unet, is proposed to achieve end-to-end defect segmentation. The median frequency balancing loss function is used to overcome the challenge of sample imbalance. Additionally, Mobile-Unet introduces depth-wise separable convolution, which dramatically reduces the complexity and model size of the network. It comprises two parts: encoder and decoder. The MobileNetV2 feature extractor is used as the encoder, and five deconvolution layers are added as the decoder. Finally, a softmax layer is used to generate the segmentation mask. The performance of the proposed model has been evaluated on public fabric datasets and self-built fabric datasets. In comparison with other methods, the experimental results demonstrate that the segmentation accuracy and detection speed of the proposed method achieve state-of-the-art performance.
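The median frequency balancing idea mentioned above is a standard recipe: each class is weighted by the median class frequency divided by its own frequency, so rare defect pixels count more in the loss. A small sketch of the weight computation, with the pixel counts being assumed example numbers:

```python
import numpy as np

def median_frequency_weights(pixel_counts):
    """Class weights for median frequency balancing: weight each class
    by median(freq) / freq, so rare classes (e.g. defect pixels)
    contribute more to the loss than abundant ones."""
    counts = np.asarray(pixel_counts, dtype=float)
    freq = counts / counts.sum()
    return np.median(freq) / freq

# Hypothetical fabric dataset: 990,000 normal pixels, 10,000 defect pixels.
weights = median_frequency_weights([990_000, 10_000])
# weights ≈ [0.505, 50.0]: the rare defect class is upweighted ~100x.
```

These weights would then scale the per-class terms of the cross-entropy loss during training.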
”I have never seen one who loves virtue as much as he loves beauty,” Confucius once said. If beauty is more important than goodness, it becomes clear why people invest so much effort in their first impression. The aesthetics of faces has many aspects and correlates strongly with human characteristics such as age and gender. Often, research on aesthetics by social and ethics scientists lacks sufficient labelled data and the support of machine vision tools. In this position paper we propose the Aesthetic-Faces dataset, containing training data labelled by Chinese and German annotators. As a combination of three image subsets, the AF-dataset consists of European, Asian and African people. The research communities in machine learning, aesthetics and social ethics can benefit from our dataset and our toolbox. The toolbox provides many functions for machine learning with state-of-the-art CNNs and an Extreme-Gradient-Boosting regressor, as well as 3D Morphable Model technologies for face shape evaluation, and we discuss how to train an aesthetic estimator considering culture and ethics.
3D assisted 2D face recognition involves the process of reconstructing 3D faces from 2D images and solving the problem of face recognition in 3D. To facilitate the use of deep neural networks, a 3D face, normally represented as a 3D mesh of vertices and its corresponding surface texture, is remapped to image-like square isomaps by a conformal mapping. Based on previous work, we assume that face recognition benefits more from texture. In this work, we focus on the surface texture and its discriminatory information content for recognition purposes. Our approach is to prepare a 3D mesh, the corresponding surface texture and the original 2D image as triple input for the recognition network, to show that 3D data is useful for face recognition. Texture enhancement methods to control the texture fusion process are introduced and we adapt data augmentation methods. Our results show that texture-map-based face recognition can not only compete with state-of-the-art systems under the same preconditions but also outperforms standard 2D methods from recent years.
It has been widely shown that biomaterial surface topography can modulate host immune response, but a fundamental understanding of how different topographies contribute to pro-inflammatory or anti-inflammatory responses is still lacking. To investigate the impact of surface topography on immune response, we undertook a systematic approach by analyzing immune response to eight grades of medical grade polyurethane of increasing surface roughness in three in vitro models of the human immune system. Polyurethane specimens were produced with defined roughness values by injection molding according to the VDI 3400 industrial standard. Specimens ranged from 0.1 μm to 18 μm in average roughness (Ra), which was confirmed by confocal scanning microscopy. Immunological responses were assessed with THP-1-derived macrophages, human peripheral blood mononuclear cells (PBMCs), and whole blood following culture on polyurethane specimens. As shown by the release of pro-inflammatory and anti-inflammatory cytokines in all three models, a mild immune response to polyurethane was observed; however, it was not associated with the degree of surface roughness. Likewise, the cell morphology (cell spreading, circularity, and elongation) in THP-1-derived macrophages and the expression of CD molecules in the PBMC model on T cells (HLA-DR and CD16), NK cells (HLA-DR), and monocytes (HLA-DR, CD16, CD86, and CD163) showed no influence of surface roughness. In summary, this study shows that modifying surface roughness in the micrometer range on polyurethane has no impact on the pro-inflammatory immune response. Therefore, we propose that such modifications do not affect the immunocompatibility of polyurethane, thereby supporting the notion of polyurethane as a biocompatible material.
Appropriate mechanical properties and fast endothelialization of synthetic grafts are key to ensure long-term functionality of implants. We used a newly developed biostable polyurethane elastomer (TPCU) to engineer electrospun vascular scaffolds with promising mechanical properties (E-modulus: 4.8 ± 0.6 MPa, burst pressure: 3326 ± 78 mmHg), which were biofunctionalized with fibronectin (FN) and decorin (DCN). Neither uncoated nor biofunctionalized TPCU scaffolds induced major adverse immune responses except for minor signs of polymorph nuclear cell activation. The in vivo endothelial progenitor cell homing potential of the biofunctionalized scaffolds was simulated in vitro by attracting endothelial colony-forming cells (ECFCs). Although DCN coating did attract ECFCs in combination with FN (FN + DCN), DCN-coated TPCU scaffolds showed a cell-repellent effect in the absence of FN. In a tissue-engineering approach, the electrospun and biofunctionalized tubular grafts were cultured with primary-isolated vascular endothelial cells in a custom-made bioreactor under dynamic conditions with the aim to engineer an advanced therapy medicinal product. Both FN and FN + DCN functionalization supported the formation of a confluent and functional endothelial layer.
We discuss the fabrication technologies for IC chips in this chapter. We will focus on the main process steps and especially on those aspects that are of particular importance for understanding how they affect, and in some cases drive, the layout of ICs. All our analyses in this chapter will be for silicon as the base material; the principles and understanding gained can be applied to other substrates as well. Following a brief introduction to the fundamentals of IC fabrication (Sect. 2.1) and the base material used in it, namely silicon (Sect. 2.2), we discuss the photolithography process deployed for all structuring work in Sect. 2.3. We will then present in Sect. 2.4 some theoretical opening remarks on typical phenomena encountered in IC fabrication. Knowledge of these phenomena is very useful for understanding the process steps we cover in Sects. 2.5–2.8. We examine a simple exemplar process in Sect. 2.9 and observe how a field-effect transistor (FET), the most important device in modern integrated circuits, is created. To drive the key points home, we provide a review of each topic at the end of every section from the point of view of layout design by discussing relevant physical design aspects.
The ZD.BB - a digital hub for small and medium-sized enterprises in the Stuttgart region
(2020)
Digital transformation is one of the most discussed topics in today's business world. Many companies, especially small and medium-sized enterprises (SMEs), struggle to assess the opportunities and risks of digitalization. With all the possibilities and opportunities that digitalization offers, companies that close themselves off from these developments risk losing their market and competitive position. The Digital Hub ZD.BB (Zentrum Digitalisierung), opened in February 2019, provides the Stuttgart region with a new central point of contact for all questions concerning digitalization. At the ZD.BB, small and medium-sized enterprises (SMEs) and startups receive competent consulting and support for their digital transformation processes, ranging from awareness-raising through analysis to the development of solutions for digital processes. With the help of a digital qualification initiative and SME-oriented methods for business model development, companies are comprehensively supported in their digitalization projects at the ZD.BB. To this end, different competences, disciplines, ideas, technologies and creativity are networked in innovation labs, coworking spaces and at events, thereby generating digital innovations.
Development work within an experimental environment, in which certain properties are investigated and optimized, requires many test runs and is therefore often associated with long execution times, costs and risks. This can affect product, material and technology development in industry and research. New digital technologies offer the possibility to automate complex manual work steps cost-effectively, to increase the relevance of the results and to accelerate the processes many times over. In this context, this article presents a low-cost, modular and open-source machine vision system for test execution and evaluates it on the basis of a real industrial application. For this purpose, a methodology is presented for the automated execution of the load intervals, for process documentation and for the evaluation of the generated data by means of machine learning to classify wear levels. The software and the mechanical structure are designed to be adaptable to different conditions and components and to a variety of tasks in industry and research. The mechanical structure is required for tracking the test object and represents a motion platform positioned independently by machine vision operators or machine learning. The state of the test object is evaluated by transfer learning after the initial documentation run. The manual procedure for classifying the visually recorded data on the state of the test object is described for the training material. This leads to increased resource efficiency on the material as well as on the personnel side, since on the one hand the significance of the tests performed is increased by the continuous documentation, and on the other hand the responsible experts can be assigned time-efficiently. The presence and know-how of the experts are therefore only required for defined and decisive events during the execution of the experiments.
Furthermore, the generated data are suitable for later use as an additional source of data for predictive maintenance of the developed object.
Additive manufacturing (AM) is increasingly used in the industrial sector as a result of continuous development. Within the production planning and control (PPC) system, AM enables an agile response in the area of detailed and process planning, especially for a large number of plants. For this purpose, a concept for a PPC system for AM is presented which takes into account the requirements for integration into the operational enterprise software system. Its technical applicability is demonstrated by individual implemented sections. The presented solution approach promises a more efficient utilization of the plants and a more flexible use.
Human resource management
(2020)
Even though its value does not appear on any balance sheet, human capital determines a company's success. While capital is available in abundance, personnel is increasingly the bottleneck factor. Whereas up to the 1980s people were seen as a production factor and the HR department as its administrative body, personnel work today is an integrative element of the management process and the HR department an active part of the management team (Scholz 2014c). Associated with this is the conceptual shift from personnel administration to personnel management, or human resource management (HRM). These terms signal a more strategically oriented engagement with all questions concerning the deployment of personnel and the linking of HR strategy with corporate strategy.
Important tasks of HR work are personnel planning, recruitment, personnel development, staff deployment, personnel cost management and leadership. These are usually performed by different parties: besides the HR department, the direct supervisor and the company management also play an important role.
Planning of available resources considering ergonomics under deterministic highly variable demand
(2020)
In this paper, a method for hybrid short- to long-term planning of available resources for operations is presented, which is based on a known or deterministically forecasted but highly variable demand. The method considers quantitative measures such as the performance and availability of resources, ergonomically relevant KPIs, and ultimately process costs in order to serve as a pragmatic planning tool for operations managers in SMEs. Specifically, the method enables exploiting the ergonomic advantages of available flexible automation technology (e.g. AGVs or picking robots), while assuring that these do not represent a capacity bottleneck. After presenting the method along with the necessary assumptions, mainly concerning the availability of data for the calculations, we report a case study that quantifies the impact of throughput variability on the selection of different process alternatives, where different teams of resources are used.
It is essential for the success of a company to set a strategic direction in which a product offering will be developed over time to achieve the company vision. For this reason, roadmaps are used in practice. In general, roadmaps can take various forms, such as technology roadmaps, product roadmaps or industry roadmaps. From the point of view of industry, the basic purpose of a roadmap is to explore, visualize and communicate the dynamic linkage between markets, products and technology.
Nowadays companies are facing increasing market dynamics, rapidly evolving technologies and shifting user expectations. Together with the adoption of lean and agile practices, this situation makes it increasingly difficult to plan and predict upfront which products, services or features should be developed in the future. Consequently, many organizations are struggling with their ability to provide reliable and stable product roadmaps by applying traditional approaches. This paper aims at identifying and better understanding which measures companies have taken to adapt their current product roadmapping practices to the requirements of a dynamic and uncertain market environment. This also includes challenges and success factors within this transformation process as well as measures that companies have planned for the future. We conducted 18 semi-structured expert interviews with practitioners of different companies and performed a thematic data analysis. The study shows that the participating companies are aware that the transformation of traditional product roadmapping practices to fulfill the requirements of a dynamic and uncertain market environment is necessary. The most important measures that the participating companies have taken are 1) adequate item planning concerning the timeline, 2) the replacement of a fixed time-based chart by a more flexible structure, 3) the use of outcomes to determine the items (such as features) on a roadmap, and 4) the creation of a central roadmap from which different representations can be derived for each stakeholder and department.
Context: A product roadmap is an important tool in product development. It sets the strategic direction in which the product is to be developed to achieve the company’s vision. However, for product roadmaps to be successful, it is essential that all stakeholders agree with the company’s vision and objectives and are aligned and committed to a common product plan.
Objective: In order to gain a better understanding of product roadmap alignment, this paper aims at identifying measures, activities and techniques in order to align the different stakeholders around the product roadmap.
Method: We conducted a grey literature review according to the guidelines of Garousi et al.
Results: Several approaches to gain alignment were identified such as defining and communicating clear objectives based on the product vision, conducting cross-functional workshops, shuttle diplomacy, and mission briefing. In addition, our review identified the “Behavioural Change Stairway Model” that suggests five steps to gain alignment by building empathy and a trustful relationship.
Context: Organizations are increasingly challenged by dynamic and technical market environments. Traditional product roadmapping practices such as detailed and fixed long-term planning typically fail in such environments. Therefore, companies are actively seeking ways to improve their product roadmapping approach.
Goal: This paper aims at identifying problems and challenges with respect to product roadmapping. In addition, it aims at understanding how companies succeed in improving their roadmapping practices in their respective company contexts.
Method: We conducted semi-structured expert interviews with 15 experts from 13 German companies and performed a thematic data analysis.
Results: The analysis showed that a significant number of companies are still struggling with traditional feature-based product roadmapping and opinion-based prioritization of features. The most promising areas for improvement are stating the outcomes a company is trying to achieve and making them part of the roadmap, sharing or co-developing the roadmap with stakeholders, and establishing discovery activities.
The emergence of agile methods and practices has not only changed the development processes but might also have affected how companies conduct software process improvement (SPI). Through a set of complementary studies, we aim to understand how SPI has changed in times of agile software development. Specifically, we aim (1) to identify and characterize the set of publications that connect elements of agility to SPI, (2) to explore to which extent agile methods/practices have been used in the context of SPI, and (3) to understand whether the topics addressed in the literature are relevant and useful for industry professionals. To study these questions, we conducted an in-depth analysis of the literature identified in a previous mapping study, an interview study, and an analysis of the responses given by industry professionals to SPI-related questions stemming from an independently conducted survey study.
A fast way to test business ideas and to explore customer problems and needs is to talk to customers. Customer interviews help to understand what solutions customers will pay for before valuable resources are invested in developing those solutions. Customer interviews are a good way to gain qualitative insights. However, conducting interviews can be a difficult procedure and requires specific skills. The current ways of teaching interview skills have significant deficiencies; in particular, they lack guidance and opportunities to practice. Objective: The goal of this work is to develop and validate a workshop format that teaches the skills needed to conduct good customer interviews in a practical manner. Method: The research method is based on design science research, which serves as a framework. A game-based workshop format was designed to teach interview skills. The approach consists of a half-day, hands-on workshop and is based on an analysis of the necessary interview skills. The approach has been validated in several workshops and improved based on the learnings from those workshops. Results: Results of the validation show that participants could significantly improve their interview skills while enjoying the game-based exercises. The game-based learning approach supports learning and practicing customer interview skills with playful and interactive elements that encourage greater motivation among participants to conduct interviews.
In recent years companies have faced challenges from high market dynamics, rapidly evolving technologies and shifting user expectations. Together with the adoption of lean and agile practices, it is increasingly difficult to predict upfront which products, features or services will satisfy the needs of the customers and the organization. Currently, many new products fail to produce a significant financial return. One reason is that companies are not doing enough product discovery activities. Product discovery aims at tackling the various risks before the implementation of a product starts. The academic literature provides only little guidance for conducting product discovery in practice. Objective: In order to gain a better understanding of product discovery activities in practice, this paper aims at identifying motivations, approaches, challenges, risks, and pitfalls of product discovery reported in the grey literature. Method: We performed a grey literature review (GLR) according to the guidelines of Garousi et al. Results: The study shows that the main motivation for conducting product discovery activities is to reduce the uncertainty to a level that makes it possible to start building a solution that provides value for the customers and the business. Several product discovery approaches are reported in the grey literature, which include different phases such as alignment, problem exploration, ideation, and validation. Main challenges are, among others, the lack of clarity of the problem to be solved, the prescription of concrete solutions by management or experts, and the lack of cross-functional collaboration.
Background
A pressing task of electrocardiographic examination is to increase the reliability of diagnosing the condition of the heart. Within this task, an important direction is the solution of the inverse problem of electrocardiography, based on processing the electrocardiographic signals of multichannel cardio leads with known electrode coordinates in these leads (Titomir et al., Noninvasive electrocardiotopography, 2003), (Macfarlane et al., Comprehensive Electrocardiology, 2nd ed., Chapter 9, 2011).
Results
In order to obtain more detailed information about the electrical activity of the heart, we reconstruct the distribution of equivalent electrical sources on the heart surface. We perform this reconstruction of the equivalent sources over the cardiac cycle at relatively low hardware cost. ECG maps of electrical potentials on the surface of the torso (TSPM) and of electrical sources on the surface of the heart (HSSM) were studied for different times of the cardiac cycle. We carried out a visual and quantitative comparison of these maps in the presence of pathological regions of different localization. For this purpose we used a model of the electrical activity of the heart based on cellular automata.
Conclusions
The cellular automaton model allows us to consider the processes of heart excitation in the presence of pathological regions of various sizes and localizations. It is shown that changes in the distribution of electrical sources on the surface of the epicardium in the presence of pathological areas with disturbed conduction of heart excitation are much more noticeable than changes in ECG maps on the torso surface.
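Cellular automaton models of excitable tissue like the one used here typically cycle each cell through resting, excited and refractory states. A minimal generic sketch in the Greenberg-Hastings style (not the authors' specific model; grid size, neighbourhood and boundary handling are assumptions):

```python
import numpy as np

REST, EXCITED, REFRACTORY = 0, 1, 2

def step(grid):
    """One update of a Greenberg-Hastings style excitable medium:
    resting cells fire if a 4-neighbour is excited, excited cells
    become refractory, and refractory cells recover to rest.
    np.roll gives periodic boundaries, fine for an interior stimulus."""
    new = np.full_like(grid, REST)
    excited = grid == EXCITED
    neighbour_excited = (
        np.roll(excited, 1, 0) | np.roll(excited, -1, 0)
        | np.roll(excited, 1, 1) | np.roll(excited, -1, 1)
    )
    new[(grid == REST) & neighbour_excited] = EXCITED
    new[excited] = REFRACTORY
    return new

grid = np.zeros((5, 5), dtype=int)
grid[2, 2] = EXCITED          # point stimulus at the centre
grid = step(grid)             # a ring of excitation starts to spread
```

A pathological region with disturbed conduction could be modelled by cells that never leave the resting state, around which the excitation wave has to travel.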
Regardless of company size or industry sector, a majority of project teams and companies use customized processes that combine different development methods, so-called hybrid development methods. Even though such hybrid development methods are highly individualized, a common understanding of how to systematically construct synergetic practices is missing. Based on 1,467 data points from a large-scale online survey among practitioners, we study the current state of practice in process use to answer the question: what are hybrid development methods made of? Our findings reveal that only eight methods and a few practices build the core of modern software development. This small set allows for statistically constructing hybrid development methods.
Today, many companies are adapting their strategy, business models, products, services as well as business processes and information systems in order to expand their digitalization level through intelligent systems and services. The paper raises an important question: What are cognitive co-creation mechanisms for extending digital services and architectures to readjust the usage value of smart services? Typically, extensions of digital services and products and their architectures are manual design tasks that are complex and require specialized, rare experts. The current publication explores the basic idea of extending specific digital artifacts, such as intelligent service architectures, through mechanisms of cognitive co-creation to enable a rapid evolutionary path and better integration of humans and intelligent systems. We explore the development of intelligent service architectures through a combined, iterative, and permanent task of co-creation between humans and intelligent systems as part of a new concept of cognitively adapted smart services. In this paper, we present components of a new platform for the joint co-creation of cognitive services for an ecosystem of intelligent services that enables the adaptation of digital services and architectures.
AI technologies such as deep learning provide promising advances in many areas. Using these technologies, enterprises and organizations implement new business models and capabilities. Initially, AI technologies were deployed in experimental environments: AI-based applications were created in an ad-hoc manner, without methodological guidance or an engineering approach. Due to the increasing importance of AI technologies, however, a more structured approach is necessary that enables the methodological engineering of AI-based applications. Therefore, in this paper we develop first steps towards the methodological engineering of AI-based applications. First, we identify some important differences between the technological foundations of AI technologies, in particular deep learning, and traditional information technologies. Then we create a framework for engineering AI applications in four steps: identification of an AI application type, sub-type identification, lifecycle phase, and definition of details. The introduced framework takes into account that AI applications use an inductive approach to infer knowledge from huge collections and streams of data. It not only enables the rapid development of AI applications but also the efficient sharing of knowledge about them.
Digital technologies are main strategic drivers for digitalization and offer ubiquitous data availability, unlimited connectivity, and massive processing power for a fundamentally changing business. This leads to the development and application of intelligent digital systems. The current state of research and practice of architecting digital systems and services lacks a solid methodological foundation that fully accommodates all requirements linked to efficient and effective development of digital systems in organizations. Research presented in this paper addresses the question of how the management of complexity in digital systems and architectures can be supported from a methodological perspective. In this context, the current focus is on a better understanding of the causes of increased complexity and of the requirements for methodological support. For this purpose, we take an enterprise architecture (EA) perspective, i.e. we examine how the introduction of digital systems affects the complexity of the EA. Two industrial case studies and a systematic literature analysis result in the proposal of an extended Digital Enterprise Architecture Cube as a framework for future methodical support.
Artificial Intelligence-based Assistants (AIAs) are spreading quickly in both homes and offices. They have already left their original habitat of "intelligent speakers" providing easy access to music collections: they have initiated a multitude of new devices and are already populating devices such as TV sets. Characteristic of intelligent digital assistants is the formation of platforms around their core functionality. Thus, the AI capabilities of the assistants are used to offer new services and create new interfaces for business processes. There are positive network effects between the assistants and the services as well as among the services. Therefore, many companies see the need to get involved in the field of digital assistants but lack a framework to align their initiatives with their corporate strategies. In order to lay the foundation for a comprehensive method, we therefore investigate intelligent digital assistants. Based on this analysis, we develop a framework of strategic opportunities and challenges.
Intelligent systems and services are the strategic targets of many current digitalization efforts and part of massive digital transformations based on digital technologies with artificial intelligence. Digital platform architectures and ecosystems provide an essential base for intelligent digital systems. The paper raises an important question: Which development paths are induced by current innovations in the field of artificial intelligence and digitalization for enterprise architectures? Digitalization disrupts existing enterprises, technologies, and economies and promotes the architecture of cognitive and open intelligent environments. This has a strong impact on new opportunities for value creation and the development of intelligent digital systems and services. Digital technologies such as artificial intelligence, the Internet of Things, service computing, cloud computing, blockchains, big data with analysis, mobile systems, and social business network systems are essential drivers of digitalization. We investigate the development of intelligent digital systems supported by a suitable digital enterprise architecture. We present methodological advances and an evolutionary path for architectures with an integral service and value perspective to enable intelligent systems and services that effectively combine digital strategies and digital architectures with artificial intelligence.
Artificial Intelligence enables innovative applications, and applications based on Artificial Intelligence are increasingly important for all aspects of the Digital Economy. However, the question of how AI resources such as tools and data can be linked to provide an AI-capability and create business value is still open. Therefore, this paper identifies the value-creating mechanisms of connectionist artificial intelligence using a capability-oriented view and points out the connections to different kinds of business value. The analysis supports an agenda that identifies areas that need further research to understand the mechanism of value creation in connectionist artificial intelligence.
Simple MOSFET models intended for hand analysis are inaccurate in deep sub-micrometer process technologies and in the moderate inversion region of device operation. Accurate models, such as the Berkeley BSIM6 model, are too complex for use in hand analysis and are intended for circuit simulators. Artificial neural networks (ANNs) are efficient at capturing both linear and non-linear multivariate relationships. In this work, a straightforward modeling technique is presented using ANNs to replace the BSIM model equations. Existing open-source libraries are used to quickly build models with error rates generally below 3%. When combined with a novel approach, such as the gm/Id systematic design method, the presented models are sufficiently accurate for use in the initial sizing of analog circuit components without simulation.
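The core idea, fitting a small network to simulated device data in place of the BSIM equations, can be sketched as follows. The square-law target function standing in for BSIM simulation output, the parameter values, and the random-feature training shortcut (a least-squares output layer over fixed random hidden units instead of the paper's open-source ANN libraries) are all illustrative assumptions, not the paper's actual setup:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "simulation" data: a simple square-law saturation model stands
# in for BSIM-generated drain currents (illustrative only, not BSIM6).
vgs = rng.uniform(0.4, 1.2, 400)
vds = rng.uniform(0.1, 1.2, 400)
vth, k, lam = 0.35, 2e-3, 0.1
i_d = 0.5 * k * np.maximum(vgs - vth, 0.0) ** 2 * (1.0 + lam * vds)

# Normalize inputs and target to comparable scales before fitting.
X = np.column_stack([vgs, vds])
Xn = (X - X.mean(0)) / X.std(0)
y = i_d / i_d.max()

# One hidden tanh layer with fixed random weights; the linear output
# layer is fit in closed form (random-feature style), which keeps the
# sketch deterministic and short.
W = rng.normal(size=(2, 64))
b = rng.normal(size=64)
H = np.tanh(Xn @ W + b)
beta, *_ = np.linalg.lstsq(np.column_stack([H, np.ones(len(H))]), y, rcond=None)

def predict(vgs_q, vds_q):
    """Evaluate the fitted network at query bias points."""
    xq = (np.column_stack([vgs_q, vds_q]) - X.mean(0)) / X.std(0)
    hq = np.tanh(xq @ W + b)
    return (np.column_stack([hq, np.ones(len(hq))]) @ beta) * i_d.max()

# Worst-case relative error over the training bias points.
rel_err = np.abs(predict(vgs, vds) - i_d).max() / i_d.max()
```

On a smooth target like this, the fit easily stays within the few-percent error band the paper reports for its BSIM-trained models.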
Scenario-based analysis is a comprehensive technique to evaluate software quality and can provide more detailed insights than e.g. maintainability metrics. Since such methods typically require significant manual effort, we designed a lightweight scenario-based evolvability evaluation method. To increase efficiency and to limit assumptions, the method exclusively targets service- and microservice-based systems. Additionally, we implemented web-based tool support for each step. Method and tool were also evaluated with a survey (N=40) that focused on change effort estimation techniques and hands-on interviews (N=7) that focused on usability. Based on the evaluation results, we improved method and tool support further. To increase reuse and transparency, the web-based application as well as all survey and interview artifacts are publicly available on GitHub. In its current state, the tool-supported method is ready for first industry case studies.
While many maintainability metrics have been explicitly designed for service-based systems, tool-supported approaches to automatically collect these metrics are lacking. Especially in the context of microservices, decentralization and technological heterogeneity may pose challenges for static analysis. We therefore propose the modular and extensible RAMA approach (RESTful API Metric Analyzer) to calculate such metrics from machine-readable interface descriptions of RESTful services. We also provide prototypical tool support, the RAMA CLI, which currently parses the formats OpenAPI, RAML, and WADL and calculates 10 structural service-based metrics proposed in scientific literature. To make RAMA measurement results more actionable, we additionally designed a repeatable benchmark for quartile-based threshold ranges (green, yellow, orange, red). In an exemplary run, we derived thresholds for all RAMA CLI metrics from the interface descriptions of 1,737 publicly available RESTful APIs. Researchers and practitioners can use RAMA to evaluate the maintainability of RESTful services or to support the empirical evaluation of new service interface metrics.
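A minimal sketch of how structural interface metrics can be derived from a parsed OpenAPI document. The metric names, the helper function, and the tiny example spec are illustrative; this is not the RAMA CLI implementation:

```python
# HTTP methods that count as operations in an OpenAPI "paths" object.
HTTP_METHODS = {"get", "put", "post", "delete", "patch", "head", "options"}

def interface_metrics(openapi: dict) -> dict:
    """Compute a few simple structural metrics from a parsed OpenAPI dict."""
    paths = openapi.get("paths", {})
    ops = [m for item in paths.values() for m in item if m in HTTP_METHODS]
    # Average path depth: number of non-empty segments per route.
    depths = [len([s for s in p.split("/") if s]) for p in paths]
    return {
        "operations": len(ops),   # rough analogue of an interface size metric
        "paths": len(paths),
        "avg_path_depth": sum(depths) / len(depths) if depths else 0.0,
    }

# Tiny hand-written spec fragment for illustration.
spec = {"paths": {"/orders": {"get": {}, "post": {}},
                  "/orders/{id}": {"get": {}, "delete": {}}}}
m = interface_metrics(spec)
```

A real analyzer would first parse OpenAPI/RAML/WADL files into such a dict and then compute the full metric suite over it.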
Context: Fast moving markets and the age of digitization require that software can be quickly changed or extended with new features. The associated quality attribute is referred to as evolvability: the degree of effectiveness and efficiency with which a system can be adapted or extended. Evolvability is especially important for software with frequently changing requirements, e.g. internet-based systems. Several evolvability-related benefits were arguably gained with the rise of service-oriented computing (SOC) that established itself as one of the most important paradigms for distributed systems over the last decade. The implementation of enterprise-wide software landscapes in the style of service-oriented architecture (SOA) prioritizes loose coupling, encapsulation, interoperability, composition, and reuse. In recent years, microservices quickly gained in popularity as an agile, DevOps-focused, and decentralized service-oriented variant with fine-grained services. A key idea here is that small and loosely coupled services that are independently deployable should be easy to change and to replace. Moreover, one of the postulated microservices characteristics is evolutionary design.
Problem Statement: While these properties provide a favorable theoretical basis for evolvable systems, they offer no concrete and universally applicable solutions. As with each architectural style, the implementation of a concrete microservice-based system can be of arbitrary quality. Several studies also report that software professionals trust in the foundational maintainability of service orientation and microservices in particular. A blind belief in these qualities without appropriate evolvability assurance can lead to violations of important principles and therefore negatively impact software evolution. In addition to this, very little scientific research has covered the areas of maintenance, evolution, or technical debt of microservices.
Objectives: To address this, the aim of this research is to support developers of microservices with appropriate methods, techniques, and tools to evaluate or improve evolvability and to facilitate sustainable long-term development. In particular, we want to provide recommendations and tool support for metric-based as well as scenario-based evaluation. In the context of service-based evolvability, we furthermore want to analyze the effectiveness of patterns and collect relevant antipatterns. Methods: Using empirical methods, we analyzed the industry state of the practice and the academic state of the art, which helped us to identify existing techniques, challenges, and research gaps. Based on these findings, we then designed new evolvability assurance techniques and used additional empirical studies to demonstrate and evaluate their effectiveness. The empirical methods applied included, for example, surveys, interviews, (systematic) literature studies, and controlled experiments.
Contributions: In addition to our analyses of industry practice and scientific literature, we provide contributions in three different areas. With respect to metric-based evolvability evaluation, we identified a set of structural metrics specifically designed for service orientation and analyzed their value for microservices. Subsequently, we designed tool-supported approaches to automatically gather a subset of these metrics from machine-readable RESTful API descriptions and via a distributed tracing mechanism at runtime. In the area of scenario-based evaluation, we developed a tool-supported lightweight method to analyze the evolvability of a service-based system based on hypothetical evolution scenarios. We evaluated the method with a survey (N=40) as well as hands-on interviews (N=7) and improved it further based on the findings. Lastly, with respect to patterns and antipatterns, we collected a large set of service-based patterns and analyzed their applicability for microservices. From this initial catalogue, we synthesized a set of candidate evolvability patterns via the proxy of architectural modifiability tactics. The impact of four of these patterns on evolvability was then empirically tested in a controlled experiment (N=69) and with a metric-based analysis. The results suggest that the additional structural complexity introduced by the patterns as well as developers' pattern knowledge have an influence on their effectiveness. As a last contribution, we created a holistic collection of service-based antipatterns for both SOA and microservices and published it in a collaborative repository.
Conclusion: Our contributions provide first foundations for a holistic view on the evolvability assurance of microservices and address several perspectives. Metric- and scenario-based evaluation as well as service-based antipatterns can be used to identify "hot spots" while service-based patterns can remediate them and provide means for systematic evolvability construction. All in all, researchers and practitioners in the field of microservices can use our artifacts to analyze and improve the evolvability of their systems as well as to gain a conceptual understanding of service-based evolvability assurance.
Polyelectrolyte multilayer (PEM) coatings are prepared by alternating layer-by-layer deposition of cationic and anionic polyelectrolyte monolayers on charged surfaces. The thickness of the coatings ranges from a few nm to a few μm. Their properties, such as roughness, stiffness, surface charge and surface energy, can be precisely tuned to fulfil different technical or biological requirements. The coating process is based on the self-assembly of polyelectrolytes. Advantages of these coatings are their easy handling, the absence of harsh chemistry and the possibility of coating complex geometries. PEM coatings can be prepared from a variety of suitable polyelectrolytes. Their stability varies from very durable PEM coatings that are only soluble in strong solvents to quickly degradable ones, which may be applied as drug release systems. One example of such a degradable PEM system is the one based on the polyelectrolyte pair hyaluronan (HA) and chitosan (CHI). These biopolymers originate from natural sources and show low toxicity towards human cells. However, HA/CHI multilayers show only weak adhesiveness for human umbilical vein endothelial cells (HUVEC). In this article, we summarize our approaches to enhance HA/CHI multilayers by incorporation of a non-polymer substance, graphene oxide, to improve cell adhesion while retaining properties such as low cytotoxicity and biodegradability. Different approaches for the incorporation of graphene oxide were performed and the cellular adhesion was tested by a metabolic assay.
Introduction: Bioresorbable collagenous barrier membranes are used to prevent premature soft tissue ingrowth and to allow bone regeneration. For volume-stable indications, only non-absorbable synthetic materials are available. This study investigates a new bioresorbable hydrofluoric acid (HF)-treated magnesium (Mg) mesh in a native collagen membrane for volume-stable situations. Materials and Methods: HF-treated and untreated Mg were compared in direct and indirect cytocompatibility assays. In vivo, 18 New Zealand White rabbits each received four 8 mm calvarial defects and were divided into four groups: (a) HF-treated Mg mesh/collagen membrane, (b) untreated Mg mesh/collagen membrane, (c) collagen membrane and (d) sham operation. After 6, 12 and 18 weeks, Mg degradation and bone regeneration were measured using radiological and histological methods. Results: In vitro, HF-treated Mg showed higher cytocompatibility. Histopathologically, HF-Mg prevented gas cavities and was degraded by mononuclear cells via phagocytosis up to 12 weeks. Untreated Mg showed in part significantly more gas cavities and a fibrous tissue reaction. Bone regeneration was not significantly different between the groups. Discussion and Conclusions: HF-Mg meshes embedded in native collagen membranes represent a volume-stable and biocompatible alternative to non-absorbable synthetic materials. HF-Mg shows less corrosion and is degraded by phagocytosis. However, the application of the membranes did not result in higher bone regeneration.
Controlling the surface properties and structure of thin nanosized coatings is of primary importance in diverse engineering and medical applications. Here we report on how the nanostructure, growth mechanism, thickness, roughness, and hydrophilicity of nanocomposites composed of weak natural or strong synthetic polyelectrolytes (PE) can be tailored by graphene oxide (GO) doping. GO reverses the build-up mechanism, affecting the internal structure and the hydrophilicity in a way that depends on the type of PE matrix. The extent of GO adsorption and its impact on the surface morphology were found to be independent of the type of the underlying PE matrix. The nanostructure of the hybrid films is not significantly altered when a single surface-exposed GO layer is deposited, while increasing the number of embedded GO layers leads to pronounced surface heterogeneity. These results are expected to have a valuable impact on construction strategies for coatings with tunable surface properties.
Drug-induced liver toxicity is one of the most common reasons for the failure of drugs in clinical trials and for frequent withdrawals from the market. Reasons for such failures include the low predictive power of in vivo studies, which is mainly caused by metabolic differences between humans and animals, and intraspecific variances. In addition to factors such as age and genetic background, changes in drug metabolism can also be caused by disease-related changes in the liver. Such metabolic changes have also been observed in clinical settings, for example, in association with a change in liver stiffness, a major characteristic of an altered fibrotic liver. To mimic these changes in an in vitro model, this study aimed to develop scaffolds that represent the rigidity of healthy and fibrotic liver tissue. We observed that liver cells plated on scaffolds representing the stiffness of healthy livers showed a higher metabolic activity compared to cells plated on stiffer scaffolds. Additionally, we detected a positive effect of pre-coating the scaffolds with fetal calf serum (FCS)-containing media: this pre-incubation resulted in increased cell adherence during cell seeding onto the scaffolds. In summary, we developed a scaffold-based 3D model that mimics liver stiffness-dependent changes in drug metabolism and may more easily predict drug interactions in diseased livers.
Endogenous electrical fields play an important role in various physiological and pathological events. Yet the effects of electrical cues on processes such as wound healing, tumor development or metastasis are still rarely investigated, even though it is known that direct current electrical fields can alter cell migration or proliferation in vitro. Several 2D experimental models for studying cell responses to direct current electrical fields have been presented and characterized, but suitable experimental models for electrotaxis studies in 3D are rare. Here we present a novel, easy-to-produce, multi-well-based galvanotactic chamber for use in 2D and 3D cell experiments to investigate the influence of electrical fields on tumor cell migration and tumor spheroid growth. The presented system allows the simultaneous application of an electrical field to cells in four chambers, either cultured on the bottom of the culture plate (2D) or embedded in hydrogel-filled channels (3D). The set-up is also suitable for live-cell imaging. Validation tests show stable electrical fields and high cell viability inside the channels. Tumor spheroids of various diameters can be exposed to direct current electrical fields for up to one week.
Thermoplastic polycarbonate urethane elastomers (TPCU) are potential implant materials for treating degenerative joint diseases thanks to their adjustable rubber-like properties, their toughness, and their durability. We developed a water-containing high-molecular-weight sulfated hyaluronic acid-coating to improve the interaction of TPCU with the synovial fluid. It is suggested that trapped synovial fluid can act as a lubricant that reduces the friction forces and thus provides an enhanced abrasion resistance of TPCU implants. Aims of this work were (i) the development of a coating method for novel soft TPCU with high-molecular sulfated hyaluronic acid to increase the biocompatibility and (ii) the in vitro validation of the functionalized TPCUs in cell culture experiments.
Medical implants play a central role in modern medicine, and both naturally derived and synthetic materials have been explored as biomaterials for such devices. However, when implanted into living tissue, most materials initiate a host response. In addition, implants often cause bacterial infections leading to complications. Polyelectrolyte multilayer (PEM) coatings can be used for the functionalization of medical implants, improving implant integration and reducing foreign body reactions. Some PEMs are also known to show antibacterial properties. We developed a PEM coating suggesting that it can decrease the risk of bacterial infections occurring after implantation while being highly biocompatible. We applied two different standard tests for evaluating the PEM's antibacterial properties, the ISO standard test (ISO 22196) and the ASTM standard test (ASTM E2180). We found a reduction of bacterial growth on the PEM, but to a different degree depending on the testing method. This result demonstrates the need to define a proper method for evaluating the antibacterial properties of surface coatings.
This work is a comparative study of survey tools intended to help developers select a suitable tool for application in an AAL environment. The first step was to identify the basic functionality required of survey tools used for AAL technologies and to compare these tools by their functionality and assignments. The comparative study was derived from the data obtained, previous literature studies and further technical data. A list of requirements was compiled and ordered by relevance to the target application domain. With the help of an integrated assessment method, a generalized estimate value was calculated and the result is explained. Finally, the planned application of this tool in a running project is described.
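The integrated assessment step can be thought of as a weighted-sum score over the requirement list; the requirement names, weights, tool names, and fulfilment scores below are invented for illustration and are not the study's actual data:

```python
def weighted_score(weights: dict, scores: dict) -> float:
    """Normalized weighted sum: relevance weight times fulfilment (0..1)."""
    total = sum(weights.values())
    return sum(weights[r] * scores[r] for r in weights) / total

# Hypothetical requirements with relevance weights for the AAL domain.
weights = {"offline_use": 3, "open_api": 2, "accessibility": 5}

# Hypothetical per-tool fulfilment scores in [0, 1].
tools = {
    "ToolA": {"offline_use": 1.0, "open_api": 0.5, "accessibility": 0.6},
    "ToolB": {"offline_use": 0.4, "open_api": 1.0, "accessibility": 0.9},
}

# Rank tools by their generalized estimate value, best first.
ranking = sorted(tools, key=lambda t: weighted_score(weights, tools[t]),
                 reverse=True)
```

With these made-up numbers, the heavier weight on accessibility lets ToolB overtake ToolA despite its weaker offline support.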
Methods based exclusively on heart rate hardly allow differentiation between physical activity, stress, relaxation, and rest, which is why an additional sensor, such as an activity/movement sensor, is added for detection and classification. The response of the heart to physical activity, stress, relaxation, and no activity can be very similar. In this study, we observe the influence of induced stress and analyze which metrics could be considered for its detection. Changes in the Root Mean Square of Successive Differences (RMSSD) provide information about physiological changes. A set of measurements collecting the RR intervals was taken. The intervals are used as a parameter to distinguish four different stages. Parameters like skin conductivity or skin temperature were not used because the main aim is to maintain a minimum number of sensors and devices and thereby to increase wearability in the future.
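The RMSSD metric used above is straightforward to compute from a series of RR intervals; the sample interval values below are illustrative, not measured data:

```python
import math

def rmssd(rr_ms):
    """Root Mean Square of Successive Differences of RR intervals (ms)."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# A very regular rhythm yields a low RMSSD; beat-to-beat variability
# raises it, which is what makes the metric useful as a stress marker.
steady = [800, 802, 798, 801, 799]
variable = [800, 760, 840, 770, 830]
```

Classifying the four stages then reduces to comparing RMSSD (and heart rate) across measurement windows.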
In previous studies, we used a method for detecting stress based exclusively on heart rate and ECG to differentiate between situations such as mental stress, physical activity, relaxation, and rest. As a response of the heart to these situations, we observed different behavior in the Root Mean Square of Successive Differences (RMSSD) of heartbeats. This study aims to analyze whether virtual reality, delivered via a virtual reality headset, is an effective stressor for future work. The RMSSD value is an important marker of the parasympathetic effect on the heart and can provide information about stress. For these measurements, the RR intervals were collected using a chest belt; additional sensors were not used for the analysis. We conducted experiments with ten subjects who had to drive a simulator for 25 minutes using monitors and for 25 minutes using a virtual reality headset. Before starting and after finishing each simulation, the subjects completed a survey describing their mental state. The experiment results show that driving with a virtual reality headset has some influence on heart rate and RMSSD, but it does not significantly increase the stress of driving.
Cardiovascular diseases are directly or indirectly responsible for up to 38.5% of all deaths in Germany and thus represent the most frequent cause of death. At present, heart diseases are mainly discovered by chance during routine visits to the doctor or when acute symptoms occur. However, there is no practical method to proactively detect diseases or abnormalities of the heart in the daily environment and to take preventive measures for the person concerned. Long-term ECG devices, as currently used by physicians, are simply too expensive, impractical, and not widely available for everyday use. This work aims to develop an ECG device suitable for everyday use that can be worn directly on the body. For this purpose, an already existing hardware platform will be analyzed, and the corresponding potential for improvement will be identified. A precise picture of the existing data quality is obtained by metrological examination, and corresponding requirements are defined. Based on these identified optimization potentials, a new ECG device is developed. The revised ECG device is characterized by a high integration density and combines all components directly on one board except the battery and the ECG electrodes. The compact design allows the device to be attached directly to the chest. An integrated microcontroller allows digital signal processing without the need for an additional computer. Central features of the evaluation are a peak detection for detecting R-peaks and a calculation of the current heart rate based on the RR interval. To ensure the validity of the detected R-peaks, a model of the anatomical conditions is used. Thus, unrealistic RR-intervals can be excluded. The wireless interface allows continuous transmission of the calculated heart rate. Following the development of hardware and software, the results are verified, and appropriate conclusions about the data quality are drawn. 
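The R-peak logic described above, threshold detection combined with a plausibility check that rejects anatomically unrealistic RR intervals, can be sketched as follows. The sampling rate, amplitude threshold, and 0.3 s minimum RR interval are illustrative assumptions, not the device's actual parameters:

```python
def detect_r_peaks(samples, fs_hz, threshold, min_rr_s=0.3):
    """Find local maxima above `threshold`, discarding peaks that would
    imply an implausibly short RR interval (refractory period)."""
    min_gap = int(min_rr_s * fs_hz)
    peaks = []
    for i in range(1, len(samples) - 1):
        if samples[i] >= threshold and samples[i - 1] < samples[i] >= samples[i + 1]:
            if not peaks or i - peaks[-1] >= min_gap:
                peaks.append(i)
    return peaks

def heart_rate_bpm(peaks, fs_hz):
    """Mean heart rate from consecutive R-peak positions."""
    rr = [(b - a) / fs_hz for a, b in zip(peaks, peaks[1:])]
    return 60.0 / (sum(rr) / len(rr))

# Synthetic test signal: one spike every 0.8 s at 250 Hz (i.e. 75 bpm).
samples = [0.0] * 1000
for i in range(100, 1000, 200):
    samples[i] = 1.0
peaks = detect_r_peaks(samples, fs_hz=250, threshold=0.5)
bpm = heart_rate_bpm(peaks, 250)
```

A real implementation would run on bandpass-filtered ECG and adapt the threshold to signal amplitude, but the refractory-period idea is the same.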
As a result, a very compact and wearable ECG device with different wireless technologies, data storage, and evaluation of RR intervals was developed. Tests yielded runtimes of up to 24 hours with wireless LAN activated and streaming.
Globalization has increased the number of road trips and vehicles. The result has been an intensification of traffic accidents, which are becoming one of the most important causes of death worldwide. Traffic accidents are often due to human error, the probability of which increases when the cognitive ability of the driver decreases. Cognitive capacity is closely related to the driver’s mental state, as well as other external factors such as the CO2 concentration inside the vehicle. The objective of this work is to analyze how these elements affect driving. We have conducted an experiment with 50 drivers who have driven for 25 min using a driving simulator. These drivers completed a survey at the start and end of the experiment to obtain information about their mental state. In addition, during the test, their stress level was monitored using biometric sensors and the state of the environment (temperature, humidity and CO2 level) was recorded. The results of the experiment show that the initial level of stress and tiredness of the driver can have a strong impact on stress, driving behavior and fatigue produced by the driving test. Other elements such as sadness and the conditions of the interior of the vehicle also cause impaired driving and affect compliance with traffic regulations.
Comparison of sleep characteristics measurements: a case study with a population aged 65 and above (2020)
Good sleep is crucial for a healthy life of every person. Unfortunately, its quality often decreases with aging. A common approach to measuring sleep characteristics is based on interviews with the subjects or letting them fill in a daily questionnaire and afterward evaluating the obtained data. However, this method has time and personnel costs for the interviewer and the evaluator of the responses. Therefore, it would be valuable to collect and evaluate sleep characteristics automatically. To do that, it is necessary to investigate the level of agreement between measurements performed in a traditional way using questionnaires and measurements obtained using electronic monitoring devices. The study presented in this manuscript performs this investigation, comparing sleep characteristics such as "time going to bed", "total time in bed", "total sleep time" and "sleep efficiency". A total of 106 night records of elderly persons (aged 65+) were analyzed. The results achieved so far reveal that the degree of agreement between the two measurement methods varies substantially between characteristics, from 31 minutes of mean difference for "time going to bed" to 77 minutes for "total sleep time". For this reason, a direct exchange of objective and subjective measuring methods is currently not possible.
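The agreement figures quoted above are mean differences between the two measurement methods per characteristic; a minimal sketch with invented numbers (not the study's 106 night records):

```python
def mean_difference(subjective, objective):
    """Signed mean difference (bias) between paired measurements."""
    diffs = [s - o for s, o in zip(subjective, objective)]
    return sum(diffs) / len(diffs)

def mean_abs_difference(subjective, objective):
    """Mean absolute difference, insensitive to the sign of the error."""
    return sum(abs(s - o) for s, o in zip(subjective, objective)) / len(subjective)

# Hypothetical "total sleep time" values in minutes for four nights:
# questionnaire answers vs. values from the monitoring device.
tst_questionnaire = [420, 390, 450, 400]
tst_device        = [385, 360, 380, 350]
```

Computing these per characteristic ("time going to bed", "total sleep time", etc.) gives exactly the kind of per-characteristic agreement comparison the study reports.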
This document presents a new complete standalone system for the recognition of sleep apnea using signals from pressure sensors placed under the mattress. The hardware part of the system is tuned to filter and amplify the signal; its software part performs more accurate signal filtering and the identification of apnea events. The overall accuracy of recognizing apnea occurrence is 91%, with an average measured recognition delay of about 15 seconds, which confirms the suitability of the proposed method for future employment. The main aim of the presented approach is to support the healthcare system with a cost-efficient tool for the recognition of sleep apnea in the home environment.
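One plausible way to identify apnea events in such a signal is to flag windows in which the breathing-band amplitude stays below a threshold for too long; this is an assumed sketch, not the paper's actual algorithm, and the sampling rate, amplitude threshold, and 10 s minimum duration are illustrative:

```python
def detect_apnea(signal, fs_hz, amp_threshold, min_apnea_s=10.0):
    """Return (start_s, end_s) spans where |signal| stays below
    `amp_threshold` for at least `min_apnea_s` seconds."""
    events, run_start = [], None
    for i, x in enumerate(signal):
        if abs(x) < amp_threshold:
            if run_start is None:
                run_start = i
        else:
            if run_start is not None and (i - run_start) / fs_hz >= min_apnea_s:
                events.append((run_start / fs_hz, i / fs_hz))
            run_start = None
    # Handle a low-amplitude run that extends to the end of the recording.
    if run_start is not None and (len(signal) - run_start) / fs_hz >= min_apnea_s:
        events.append((run_start / fs_hz, len(signal) / fs_hz))
    return events

# Synthetic example at 10 Hz: 5 s of breathing, 12 s flat, 5 s of breathing.
signal = [1.0] * 50 + [0.0] * 120 + [1.0] * 50
events = detect_apnea(signal, fs_hz=10, amp_threshold=0.1)
```

The reported ~15 s recognition delay fits such a scheme: an event can only be confirmed after the minimum-duration window has elapsed.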
Ballistocardiography is a technique that measures the heart rate from the mechanical vibrations of the body due to the heart's movement. In this work, a novel noninvasive device placed under the mattress of a bed estimates the heart rate using ballistocardiography. Different algorithms for heart rate estimation have been developed.
We evaluate the effectiveness of different machine learning algorithms on a publicly available database of signals derived from wearable devices, with the goal of optimizing human activity recognition and classification. From the wide range of body signals, we chose two that are easy to acquire simultaneously using widespread commercial devices (e.g. smartwatches) as well as custom wearable wireless devices designed for sport, healthcare, or clinical purposes: photoplethysmographic signals (optically detected subcutaneous blood volume) and tri-axis acceleration signals. To this end, two widely used algorithms (decision tree and k-nearest neighbor) were tested, and their performance was compared to two recent algorithms (particle Bernstein and a Monte Carlo-based regression) in terms of both accuracy and processing time. A data preprocessing phase was also included to improve the performance of the machine learning procedures by reducing the problem size, and a detailed analysis of the compression strategy and its results is presented.
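Of the baseline classifiers mentioned, k-nearest neighbor reduces to a few lines once features such as mean acceleration magnitude and variance have been extracted per window; the feature values, labels, and class names below are invented for illustration, not drawn from the public database:

```python
from collections import Counter
import math

def knn_predict(train, labels, x, k=3):
    """Classify feature vector x by majority vote of its k nearest
    training points (Euclidean distance)."""
    nearest = sorted(range(len(train)), key=lambda i: math.dist(train[i], x))
    votes = Counter(labels[i] for i in nearest[:k])
    return votes.most_common(1)[0][0]

# Hypothetical features per window: (mean acceleration magnitude in g,
# acceleration variance). Resting sits near 1 g with low variance;
# walking shows higher magnitude and variance.
train = [(1.0, 0.01), (1.1, 0.02), (1.0, 0.015),
         (1.4, 0.30), (1.5, 0.35), (1.3, 0.28)]
labels = ["rest", "rest", "rest", "walk", "walk", "walk"]
```

In practice the feature vectors would also include PPG-derived quantities, and preprocessing/compression would shrink the training set before classification.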
Due to the lack of consistent data standards for Digital Factory planning systems, system-specific data exchange solutions must be implemented. To support planning, a continuous factory planning process with integrated route planning is required in terms of both process and systems. To this end, a factory planning system and a route planning system are examined for compatibility as an example, the necessary requirements are derived, and a data exchange option is demonstrated for the user.
The pharmaceutical packaging industry is shaped by extensive regulation and is therefore somewhat constrained in its pace of innovation. A six-month project on developing future scenarios for pharmaceutical packaging showed that while new technologies such as e-labels or child-resistant closures have reached or will soon reach market maturity, new requirements will demand further development in the foreseeable future. The pharmaceutical packaging industry must engage in closer and more intensive exchange with its customers and technology suppliers in order to launch the next packaging generation, Smart Packaging 2.0.
More than ten years ago, OLED technology was celebrated as a revolution in the packaging industry, a revolution that failed to materialize in practice. In an industrial cooperation project on developing future scenarios for the pharmaceutical packaging industry, OLED technology emerges as a key technology for the Smart Packaging 2.0 future scenario.
This article adopts a qualitative comparative causal mapping approach to extend knowledge of the interrelated barriers to public entrepreneurship and the outcomes of such entrepreneurship. The results highlight marked differences between the sales segment and the distribution grid segment of German public enterprises that should prompt a refined perspective on public entrepreneurship. Notably, besides intra-organizational barriers and those interfering from the external environment, results also show that a public enterprise’s supervisory board can hinder its progress. This study thus contributes to recent discussion on governance and entrepreneurship by revealing a feature that could distinguish public from private enterprises.
To remain relevant and mitigate disruption, traditional companies have to engage in multiple fast-paced experiments in digital offerings—revenue-generating solutions to what customers want and are willing to pay for, inspired by what is possible with digital technologies. After launching several digital offering initiatives, reinsurance giant Munich Re noticed that many experienced similar challenges. This case describes how Munich Re addressed these common challenges by building a foundation to help its digital offerings succeed. The foundation provided prioritized and staged funding; dedicated, hands-on expertise; and a digital platform of shared services. By 2020, this foundation was helping to support over seventy initiatives, including several that were in the market generating new sources of revenue for the company by enabling its clients—insurance companies—to better service their own customers.
The digitalization of the working world and the New Work movement are changing the way we collaborate. Accordingly, there is currently much talk about leadership, possibly too much. The central thesis of this article is that we can better understand how leaders and employees will collaborate in a future-oriented way if we talk less directly about leadership and instead start from the situation in which actors coordinate. Leadership is then considered not in isolation but together with various forms of coordination (such as autonomy, self-organization, management, and indeed leadership). In practice, this can help shape the transformation of collaboration, including the new kind of leadership, in a more reflective and goal-oriented manner.
Workplace-related digital technologies are increasingly used in modern working environments. While this offers numerous opportunities, it can also have negative consequences for employees' health. The current Corona crisis is exacerbating these challenges for many companies. Stress that arises directly or indirectly from the use of technology is referred to as "technostress". Important levers for avoiding it include the design of the technologies themselves as well as the consideration of various individual and situational factors in technological change processes.
Errors, manipulation and rationality: how reporting influences decision-makers' behavior
(2020)
The purpose of management reporting is to satisfy the information needs of executives. However, both the producers and the users of reports act only boundedly rationally. Reports therefore do not work with pinpoint accuracy but trigger a variety of undesired reactions among those involved. This article explains how the "human factor" affects the preparation and use of management reports and how effective and efficient management reporting can minimize undesired effects.
Problem: More and more companies are introducing lean principles but find that their requirements for suitable cost information are not adequately met by traditional cost accounting.
Objective: A cost accounting approach oriented toward lean thinking incorporates new cost allocation objects and provides previously neglected cost information.
Method: Common cost accounting approaches are contrasted with a closed "accounting for lean" approach, and commonalities and overlaps are identified.
Cloud resources can be dynamically provisioned according to application-specific requirements and are paid for on a per-use basis. This gives rise to a new concept for parallel processing: elastic parallel computations. However, it is still an open research question to what extent parallel applications can benefit from elastic scaling, which requires resource adaptation at runtime and corresponding coordination mechanisms. In this work, we analyze how to address these system-level challenges in the context of developing and operating elastic parallel tree search applications. Based on our findings, we discuss the design and implementation of TASKWORK, a cloud-aware runtime system specifically designed for elastic parallel tree search, which enables the implementation of elastic applications by means of higher-level development frameworks. We show how to implement an elastic parallel branch-and-bound application based on an exemplary development framework and report on our experimental evaluation, which also considers several benchmarks for parallel tree search.
High Performance Computing (HPC) enables significant progress in both science and industry. Whereas traditionally parallel applications have been developed to address the grand challenges in science, as of today, they are also heavily used to speed up the time-to-result in the context of product design, production planning, financial risk management, medical diagnosis, as well as research and development efforts. However, purchasing and operating HPC clusters to run these applications requires huge capital expenditures as well as operational knowledge and thus is reserved for large organizations that benefit from economies of scale. More recently, the cloud evolved into an alternative execution environment for parallel applications, which comes with novel characteristics such as on-demand access to compute resources, pay-per-use, and elasticity. Whereas the cloud has been mainly used to operate interactive multi-tier applications, HPC users are also interested in the benefits offered. These include full control of the resource configuration based on virtualization, fast setup times by using on-demand accessible compute resources, and eliminated upfront capital expenditures due to the pay-per-use billing model. Additionally, elasticity allows compute resources to be provisioned and decommissioned at runtime, which allows fine-grained control of an application's performance in terms of its execution time and efficiency as well as the related monetary costs of the computation. Whereas HPC-optimized cloud environments have been introduced by cloud providers such as Amazon Web Services (AWS) and Microsoft Azure, existing parallel architectures are not designed to make use of elasticity. This thesis addresses several challenges in the emergent field of High Performance Cloud Computing. In particular, the presented contributions focus on the novel opportunities and challenges related to elasticity.
First, the principles of elastic parallel systems as well as related design considerations are discussed in detail. On this basis, two exemplary elastic parallel system architectures are presented, each of which includes (1) an elasticity controller that controls the number of processing units based on user-defined goals, (2) a cloud-aware parallel execution model that handles coordination and synchronization requirements in an automated manner, and (3) a programming abstraction to ease the implementation of elastic parallel applications. To automate application delivery and deployment, novel approaches are presented that generate the required deployment artifacts from developer-provided source code in an automated manner while considering application-specific non-functional requirements. Throughout this thesis, a broad spectrum of design decisions related to the construction of elastic parallel system architectures is discussed, including proactive and reactive elasticity control mechanisms as well as cloud-based parallel processing with virtual machines (Infrastructure as a Service) and functions (Function as a Service). To evaluate these contributions, extensive experimental evaluations are presented.
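The reactive elasticity control mentioned among the design decisions can be pictured as a simple feedback loop. The following is a minimal sketch under assumed thresholds, an assumed parallel-efficiency metric, and a simple doubling/halving policy; it does not reproduce the controllers actually built in the thesis:

```python
# Toy reactive elasticity controller: one decision per control interval.
# Thresholds, the efficiency metric, and the scaling policy are assumptions.
def controller_step(current_units, measured_efficiency,
                    target=0.8, band=0.1, min_units=1, max_units=64):
    """Return the new processing-unit count for one control interval."""
    if measured_efficiency > target + band:
        # Resources are well utilized; adding units may still pay off.
        return min(current_units * 2, max_units)
    if measured_efficiency < target - band:
        # Parallel efficiency too low; shrink to save monetary cost.
        return max(current_units // 2, min_units)
    return current_units  # within the tolerance band: hold steady

print(controller_step(8, 0.95))  # scale out
print(controller_step(8, 0.50))  # scale in
```

A proactive controller would instead predict future load and act before efficiency degrades; the abstract discusses both variants.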
The objective of the project presented here is to develop an intelligent control algorithm for an energy system consisting of a biogas CHP (combined heat and power) unit, various storage technologies, such as thermal energy storages (TES) and gas storages, and other renewable energy sources, such as photovoltaics. A corresponding algorithm based on the Monte Carlo method has already been developed at Reutlingen University for CHP units running on natural gas and for heat pumps. The project presented here concentrates on the further development of this algorithm for application to biogas CHP units. In this context, an adequate implementation of the gas storage is of primary importance, as it mainly determines the flexibility of the plant. In the course of validating the new optimization algorithm, simulations were carried out based on data from the Lower Lindenhof, an agricultural experimental station of the University of Hohenheim. Both an optimization with regard to on-site electricity utilization and an optimization driven by residual load were investigated. Preliminary results show that the optimization algorithm can improve the operation of the biogas CHP unit depending on the selected target function.
The term innovation enabling refers, in the following, to a concept for the holistic support of interdisciplinary teams in creative and innovative problem solving. This concept supports moderators and participants alike, and a system implemented with it stays in the background for the user through implicit interaction. A central role is played by the Awareness Pipeline concept for implementing implicit interaction on the basis of a sensor-actuator system, which is presented in this article. Support for accompanying moderation and administration tasks, such as automated documentation of the session, is intended to offer clear added value over a classic brainstorming session in the future.
The Twelfth International Conference on Advances in Databases, Knowledge, and Data Applications (DBKDA 2020) continued a series of events covering a large spectrum of topics related to advances in the fundamentals of databases, the evolving relationship between databases and other domains, database technologies and content processing, as well as the specifics of databases in application domains. Advances in different technologies and domains related to databases have triggered substantial improvements in content processing, information indexing, and data, process and knowledge mining. The push came from Web services, artificial intelligence, and agent technologies, as well as from the generalization of XML adoption. High-speed communications and computations, large storage capacities, and load balancing for distributed database access allow new approaches to content processing with incomplete patterns, advanced ranking algorithms and advanced indexing methods. Developments in e-business, e-health and telemedicine, bioinformatics, finance and marketing, and geographical positioning systems put pressure on database communities to push the "de facto" methods to support new requirements in terms of scalability, privacy, performance, indexing, and heterogeneity of both content and technology.
Lean management has found its way into many companies. Lean concepts place new demands on the type and structure of the required cost information, demands that traditional cost accounting systems do not directly meet. Proponents of "lean accounting" therefore propose in part radical changes and a simplification of cost accounting. This article discusses the limitations of traditional cost accounting in implementing lean management and presents selected "accounting for lean" approaches. The analysis shows that lean accounting approaches are too narrowly focused and cannot adequately reflect the plurality of cost accounting functions found in practice. A radical redesign of existing cost accounting systems is therefore rejected as unrealistic and unfounded. The article develops alternative proposals for how lean management concepts and the cost information they require can be integrated into traditional cost accounting systems.
Industrial production facilities account for a significant share, around 40%, of Germany's total energy demand. They have therefore been, and continue to be, optimized both technologically and energetically. Techno-economic optimization frequently goes hand in hand with reduced energy and material consumption. In addition, the expansion of renewable energy sources is making energy generation increasingly volatile, so that not only reducing absolute energy consumption but also achieving greater flexibility (controlling power over time) is becoming increasingly attractive. This often changes the installed power as well as the design of waste heat removal, which influences the dimensioning of equipment such as metal-cutting machine tools.
A holistic approach to digitization enables decision-makers to achieve new efficiency in corporate performance management. Digitalization improves the quality, validity and speed of information retrieval and processing. At present, most corporations are confronted with the problem of not being able to organize, categorize and visualize decision-relevant information. To meet the challenges of information management, the Management Cockpit provides an information center for managers. In accordance with the specific working environment of executives, the Management Cockpit offers a quick and comprehensive overview of the company's situation. Today, the current situation of a company is no longer influenced only by internal factors but also by its public image. Social media monitoring and analysis is therefore a crucial component for capturing the external factors of successful management. Real-time monitoring of the emotions and behaviors of consumers and customers thus contributes to effective controlling of all business areas. Intelligent factories promise to collect data for the internal factors, but the current reality in manufacturing looks different. Production often consists of a large number of different machines with varying degrees of digitization and limited sensor data availability. In order to close this gap, we developed a compact sensor board with network components, which allows a flexible design with different sensors for a wide variety of applications. The sensor data enable decision-makers to adapt the supply chain based on their internal and external observations in the Management Cockpit. Thanks to its real-time and long-term monitoring and analysis capabilities, the Management Cockpit provides a multi-dimensional view of the company and supports a holistic corporate performance management.
The article studies a novel approach to inflation modeling in economics. We utilize a stochastic differential equation (SDE) of the form dX_t = a X_t dt + b X_t dB_t^H, where B_t^H is a fractional Brownian motion, in order to model inflationary dynamics. Standard economic models do not capture the stochastic nature of inflation in the Eurozone. Thus, we develop a new stochastic approach and take into consideration fractional Brownian motions as well as Lévy processes. The benefits of these stochastic processes are the modeling of interdependence and jumps, which is also confirmed by empirical inflation data. The article defines and introduces the rules for stochastic and fractional processes and elucidates the stochastic simulation output.
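A minimal simulation sketch of the stated SDE, assuming illustrative parameter values for a, b and the Hurst exponent H (none are taken from the article); the fractional Brownian path is generated with the exact but O(n^3) Cholesky method:

```python
import numpy as np

def fbm_path(n, H, T=1.0, seed=0):
    """Sample B^H at times T/n, 2T/n, ..., T via the Cholesky factor
    of the fBm covariance 0.5*(s^2H + t^2H - |s-t|^2H)."""
    t = np.linspace(T / n, T, n)
    s, u = np.meshgrid(t, t)
    cov = 0.5 * (s**(2 * H) + u**(2 * H) - np.abs(s - u)**(2 * H))
    L = np.linalg.cholesky(cov)
    return L @ np.random.default_rng(seed).standard_normal(n)

def euler_inflation_path(x0=2.0, a=0.05, b=0.2, H=0.7, n=250, T=1.0, seed=0):
    """Euler discretization of dX = a*X dt + b*X dB^H, started at x0.
    All parameter values are illustrative assumptions."""
    dt = T / n
    B = np.concatenate([[0.0], fbm_path(n, H, T, seed)])
    X = np.empty(n + 1)
    X[0] = x0
    for k in range(n):
        X[k + 1] = X[k] + a * X[k] * dt + b * X[k] * (B[k + 1] - B[k])
    return X

path = euler_inflation_path()
print(path[0], path[-1])
```

For H = 0.5 the driver reduces to ordinary Brownian motion; H > 0.5 produces the long-range dependence (interdependence of increments) that the abstract highlights.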
Resilience and stability? Setting the course in the banking and financial system during the Corona pandemic
(2020)
Since the global financial crisis of 2008/2009, there has been no challenge to the financial and banking system comparable to that during the Corona crisis.
Weak profitability, unresolved regulatory challenges and increasing competition in the digital sector pose further challenges for banks.
The stability of the financial system and access to financial markets were not at risk during the pandemic. Through joint efforts and better bank capitalisation, the financial system is now more resilient than during the financial crisis.
Provided that grants and loans in the "Next Generation EU" fund are well targeted at structural reforms and investments in the future, this should boost confidence and growth.
However, further improvements in financial stability, such as increased capital requirements, regulation of shadow banks or reforms in financial supervision, are needed.
This paper studies the impact of financial liquidity on the macro-economy. We extend a classic macroeconomic model and compute numerical simulations. The model confirms that persistently low inflation can occur despite a high degree of financial liquidity due to a reallocation of cash, normal and risk-free bonds. In that regard, our model uncovers an explanation of a flat Phillips curve. Overall, our approach contributes to a rather disregarded matter in macroeconomic theory.
With significant advancements in digital technologies, firms find themselves competing in an increasingly dynamic business environment. It is of paramount importance that organizations undertake proper governance mechanisms with respect to their business and IT strategies. Therefore, IT governance (ITG) has become an important factor for firm performance. In recent years, agility has evolved into a core concept for governance, especially in the area of software development. However, the impact of agility on ITG and firm performance has not been analyzed by the broad scientific community. This paper focuses on the question of how the concept of agility affects the ITG–firm performance relationship. The conceptual model for this question was tested in a quantitative research process with 400 executives responding to a standardized survey. Findings show that adopting agile principles, values, and best practices in the context of ITG leads to meaningful results for governance, business/IT alignment, and firm performance.
The advent of chatbots in customer service solutions has received increasing attention from research and practice throughout the last years. However, the relevant dimensions and features of service quality and service performance for chatbots remain quite unclear. Therefore, this research develops and tests a conceptual model of customer service quality and customer service performance in the context of chatbots. Additionally, the impact of the developed service dimensions on different customer relationship metrics is measured across different service channels (hotline versus chatbot). Findings of six independent studies indicate a strong main effect of the conceptualized service dimensions on customer satisfaction, service costs, intention to reuse the service, word-of-mouth, and customer loyalty. However, different service dimensions are relevant for chatbots than for a traditional service hotline.
Autonomous driving is becoming the next big digital disruption in the automotive industry. However, the possibility of integrating autonomous driving vehicles into current transportation systems not only involves technological issues but also requires the acceptance and adoption of users. Therefore, this paper develops a conceptual model for user acceptance of autonomous driving vehicles. The corresponding model is tested through a standardized survey of 470 respondents in Germany. Finally, the findings are discussed in relation to the current developments in the automotive industry, and recommendations for further research are given.
The shift of populations to cities is creating challenges in many respects, thus leading to increasing demand for smart solutions of urbanization problems. Smart city applications range from technical and social to economic and ecological. The main focus of this work is to provide a systematic literature review of smart city research to answer two main questions: (1) How is current research on smart cities structured? And (2) What directions are relevant for future research on smart cities? To answer these research questions, a text-mining approach is applied to a large number of publications. This provides an overview and gives insights into relevant dimensions of smart city research. Although the main dimensions of research are already described in the literature, an evaluation of the relevance of such dimensions is missing. Findings suggest that the dimensions of environment and governance are popular, while the dimension of economy has received only limited attention.
Our paper investigates the response of acquiring firms' stock returns around the announcement date in cross-border mergers and acquisitions (M&A) between listed Chinese acquirers and German targets. We apply an event study methodology to examine the shareholder value effect based on a sample of M&A deals over the recent period 2012-2018. We use a market model event study based on the argumentation of Brown and Warner (1985) and short-term observation periods according to Andrade, Mitchell, and Stafford (2001) as well as Hackbarth and Morellec (2008). The results indicate that the announcement of M&A involving German targets yields a positive cumulative abnormal return averaging 2.18% for Chinese acquirers' shareholders in a five-day symmetric event window. Furthermore, we find slight indications of possible information leakage prior to the formal announcement. Although the size of acquiring firms is not necessarily correlated with the positive abnormal returns in the short run, this study suggests that Chinese acquirers' shareholders gain higher abnormal returns when the German targets are non-listed companies.
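The mechanics of such a market-model event study can be sketched numerically; all returns below are synthetic, and the injected announcement effect is an assumption for illustration only, not the paper's data:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic daily returns: 130 trading days, announcement on day 125,
# five-day symmetric event window t-2..t+2 (indices 123..127).
market = rng.normal(0.0005, 0.01, 130)
stock = 0.0002 + 1.1 * market + rng.normal(0.0, 0.008, 130)
stock[123:128] += 0.015          # inject a synthetic announcement effect

est, event = slice(0, 120), slice(123, 128)

# Market model fitted by OLS on the estimation window:
#   R_stock = alpha + beta * R_market + eps
beta, alpha = np.polyfit(market[est], stock[est], 1)

# Abnormal returns = actual minus market-model prediction, cumulated to CAR.
abnormal = stock[event] - (alpha + beta * market[event])
car = abnormal.sum()
print(f"CAR over the five-day event window: {car:.4f}")
```

Averaging such CARs across all deals in the sample yields the cross-sectional figure (2.18% in the paper) that is then tested for significance.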
As part of the research project, finishing agents and processes were developed that serve to protect textiles (in particular floor coverings) preventively against soiling. The process provides for a combined finishing of textiles with fluorinated polymers containing incorporated nanoparticles (primarily SiO2) to increase surface roughness. Commercially available hydrophobing agents (fluorocarbon- or hydrocarbon-based polymers) in combination with SiO2 nanoparticles were applied to carpets and examined with regard to soiling, e.g. by coffee, Kool-Aid, red wine, AATCC standard soil, and black shoe polish. For this purpose, the shear-sensitive dispersions of the hydrophobing agents were combined with newly developed, adapted dispersions of SiO2 nanoparticles. The SiO2 nanoparticles were synthesized with systematically varied sizes of 10-1,000 nm, comprehensively characterized, and stabilized with newly developed fluoromethacrylate copolymers carrying reactive groups (maleic, itaconic or citraconic anhydride) and hydrophilic modifiers (alcohol or amine groups). The resulting polymer-particle dispersions could be applied to textiles (PA, PES or WO carpets and fabrics) from aqueous or ethanolic-aqueous solutions. Furthermore, the newly developed fluorocarbon polymers were also tested for their application. In soiling tests, the carpets finished in this way showed less soiling by standard soil than reference materials. The durability of the finish under mechanical stress could be improved by crosslinking the polymers on the textile material.
For PA 6 and PA 6.6 carpets, the best results with regard to reduced soiling by water-soluble stains (coffee, red wine, Kool-Aid) compared with untreated carpets were obtained when the finish was applied with fluoropolymer-stabilized SiO2 nanoparticles or with a combined dispersion of SiO2 particles and fluorocarbon resins. Less pronounced soiling by AATCC standard soil (DIN EN ISO 11378-2) compared with untreated carpets was found for PA 6 carpets treated with SiO2 particles. Hydrophobic stains (e.g. black shoe polish) were best removed from carpets finished with fluorocarbon polymers. The combination of SiO2 particles with fluorocarbon polymers usually proved more favorable than treatment with fluorocarbon resins alone. A relationship between nanoparticle size, abrasion resistance and cleaning properties was established, and it could be shown that fluorocarbon-nanoparticle composites improve these properties. The mechanical durability of the anti-soiling finish with SiO2 nanoparticles and fluorocarbon polymers on polyamide carpets was tested, e.g. by hexapod drum testing (according to ISO 10361). SEM, IR spectroscopy and the water droplet test demonstrated an intact coating even after 4,000 and after 12,000 revolutions. Crosslinkers that crosslink the polymer itself, the polymer with the particles, and/or the substrate surface improved abrasion resistance in some cases (more suitable crosslinkers may still need to be identified here).