Customer orientation should be the core engine of every organisation. Information technology can be considered the enabler for generating competitive advantages through customer processes in marketing, sales and service. The impact of information technologies is the biggest risk and at the same time a huge opportunity for any organisation. Research shows that Customer Relationship Management (CRM) enables organisations to perform better and focus more on their customers (e.g. the market capitalisation of Amazon). While global enterprises are shaping the future of customer centricity and information technology, the question arises of how German B2B organisations can shift their value contribution from product-centric to customer-centric. These organisations are therefore attempting to implement CRM software and to put their customers more into focus. However, the question remains how organisations approach the implementation of CRM and whether these attempts pay off in terms of business performance.
Contributing to this highly topical discussion, this thesis adds to the body of knowledge about the implementation of CRM in the German B2B sector and how it impacts business performance. First, theoretical frameworks were developed based on an extensive literature review. Here, different aspects of CRM are worked out and mapped against three dimensions of business performance, namely process efficiency, customer satisfaction and financial performance. Based on this theory, a conceptual framework was developed to test the relationships between CRM and Business Performance (BP). To this end, a survey with 500 participants was conducted. Based on this, a measurement model was developed to test five main hypotheses.
The findings for these hypotheses suggest that the implementation of CRM positively impacts business performance. Specifically, the usage of analytical CRM and the establishment of a dedicated CRM success measurement correlate with the performance of German B2B organisations. In addition to these main findings, various key statements could be derived from the research, and a measurement model was developed that can be used to assess BP across different organisational characteristics. As a result, CRM implementations can be enhanced and business performance can be improved.
The ›AM Field Guide‹ provides a first structured overview of the complex and multifaceted world of additive manufacturing processes. Separated by the material groups polymers, metals and other materials, the most common AM processes offered on the market are presented schematically and the core of each process is described in brief. In addition to the main processes shown here, there are many derivatives and special processes that are also used but not explicitly shown. It should be noted that in the young field of additive manufacturing many manufacturers give their AM applications proprietary names, so that a generally valid, comprehensive classification can only be achieved to a limited extent.
IT governance: current state of and future perspectives on the concept of agility in IT governance
(2020)
Digital transformation has changed corporate reality and, with that, corporates’ IT environments and IT governance (ITG). As such, the perspective of ITG has shifted from the design of a relatively stable, closed and controllable system of a self-sufficient enterprise to a relatively fluid, open, agile and transformational system of networked co-adaptive entities. Related to the paradigm shift in ITG, this thesis aims to conceptualize a framework to integrate the concept of agility into the traditional ITG framework and to test the effects of such an extended ITG framework on corporate performance.
To do so, the thesis uses literature research and a mixed method design by blending both qualitative and quantitative research methods. Given the poorly understood situation of the agile mechanisms within the ITG framework, the building process of this thesis’ research model requires an adaptive and flexible approach which involves four different research phases. The initial a priori research model based on a comprehensive review of the extant literature is critically examined and refined at the end of each research phase, which later forms the basis of a subsequent research phase. As a result, the final research model provides guidance on how the conceptualized framework leads to better business/IT alignment as well as how business/IT alignment can mediate the effectiveness of such an extended ITG framework on corporate performance.
The first research phase explores the current state of literature with a focus on the ITG-corporate performance association. This analysis identifies five perspectives with respect to the relationship between ITG and corporate performance. The main variables lead to the perspectives of business/IT alignment, IT leadership, IT capability and process performance, resource relatedness and culture. Furthermore, the analysis presents core aspects explored within the identified perspectives that could act as potential mediators or moderators in the relationship between ITG and corporate performance.
The second research phase investigates the agile aspect of an effective ITG framework in the dynamic contemporary world through a qualitative study. Gleaned from 46 semi-structured interviews with governance experts across various industries, the study identifies 25 agile ITG mechanisms and 22 traditional ITG mechanisms that corporations use to master digital transformation projects. Moreover, the research offers two key patterns pointing to a call for ambidextrous ITG, with corporations alternating between stability and agility in their ITG mechanisms.
In research phase three, a scale development process is conducted in order to develop the agile items explored in research phase two. Through 56 qualitative interviews with professionals, the evaluation uncovers 46 agile governance mechanisms. Moreover, these dimensions are rated by 29 experts to identify the most effective ones. This leads to the identification of six structural elements, eight processes, and eight relational mechanisms.
Finally, in research phase four a quantitative research approach through a survey of 400 respondents is established to test and predict the formulated relationships by using the partial least squares structural equation modelling (PLS-SEM) method. The results provide evidence for a strong causal relationship among an expanded ITG concept, business/IT alignment, and corporate performance. These findings reveal that the agile ITG mechanisms within an effective ITG framework seem critical in today’s digital age.
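To make the analysis step more concrete, the following Python sketch illustrates the general idea behind PLS-SEM-style estimation as a simple two-step approximation (composite scores per construct, then path regressions). It is not the thesis' actual model: the construct and indicator names are only loosely taken from the abstract, the survey data are synthetic, and a real PLS-SEM analysis iteratively estimates the indicator weights instead of using equal weights.

# Minimal sketch (not the thesis' PLS-SEM pipeline): two-step approximation
# of a mediated path model  ITG -> alignment -> performance.
# Indicator columns and equal weights are hypothetical simplifications.
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 400  # survey size mentioned in the abstract

# synthetic 5-point Likert indicators for three latent constructs
df = pd.DataFrame({
    "itg_1": rng.integers(1, 6, n), "itg_2": rng.integers(1, 6, n),
    "align_1": rng.integers(1, 6, n), "align_2": rng.integers(1, 6, n),
    "perf_1": rng.integers(1, 6, n), "perf_2": rng.integers(1, 6, n),
})

# step 1: composite (proxy) scores per construct, equal weights for simplicity
itg = df[["itg_1", "itg_2"]].mean(axis=1).to_numpy().reshape(-1, 1)
align = df[["align_1", "align_2"]].mean(axis=1).to_numpy().reshape(-1, 1)
perf = df[["perf_1", "perf_2"]].mean(axis=1).to_numpy()

# step 2: path regressions (ITG -> alignment, then ITG + alignment -> performance)
m1 = LinearRegression().fit(itg, align.ravel())
m2 = LinearRegression().fit(np.hstack([itg, align]), perf)
print("ITG -> alignment path:", m1.coef_)
print("ITG and alignment -> performance paths:", m2.coef_)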
This research is unique in exploring the combination of traditional and agile ITG mechanisms. It contributes to the theoretical base by integrating and extending the literature on ITG, business/IT alignment, ambidexterity and agility, all of which have long been recognized as critical for achieving organizational goals. In summary, this work presents an original analysis of an effective ITG framework for digital transformation by including the agile aspect within the ITG construct. It highlights that it is not enough to apply only traditional mechanisms to achieve effective business/IT alignment in today’s digital age; agile ITG mechanisms are also needed. Therefore, a novel ITG framework following an ambidextrous approach is provided, consisting of traditional ITG mechanisms as well as newly developed agile ITG practices. This thesis also demonstrates that agile ITG mechanisms can be measured independently of traditional ITG mechanisms within one causal model. This is an important theoretical outcome that allows the current state of ITG to be assessed in two distinct dimensions, offering various pathways for further research on the different antecedents and effects of traditional and agile ITG mechanisms. Furthermore, this thesis makes practical contributions by highlighting the need to develop a basic governance framework powered by traditional ITG mechanisms and simultaneously increase agility in ITG mechanisms. The results imply that corporations might be even more successful if they include both traditional and agile mechanisms in their ITG framework. In this way, the uncovered agile ITG practices may provide a template for CIOs to derive their own mechanisms in following an ambidextrous approach that is suitable for their corporation.
The development of new materials that mimic cartilage and its function is an unmet need that will allow replacing the damaged parts of joints instead of the whole joint. Polyvinyl alcohol (PVA) hydrogels have raised special interest for this application due to their biocompatibility, high swelling capacity and chemical stability. In this work, the effect of post-processing treatments (annealing, high hydrostatic pressure (HHP) and gamma-radiation) on the performance of PVA gels obtained by cast-drying was investigated, and their ability to be used as delivery vehicles for the anti-inflammatories diclofenac or ketorolac was evaluated. HHP damaged the hydrogels, breaking some bonds in the polymeric matrix, and therefore led to poor mechanical and tribological properties. The remaining treatments, in general, improved the performance of the materials, increasing their crystallinity. Annealing at 150 °C generated the best mechanical and tribological results: higher resistance to compressive and tensile loads, lower friction coefficients and the ability to support higher loads in sliding movement. This material was loaded with the anti-inflammatories, both without and with vitamin E (Vit.E) or Vit.E + cetalkonium chloride (CKC). Vit.E + CKC helped to control the release of the drugs, which occurred within 24 h. The material did not induce irritation or cytotoxicity and therefore shows high potential to be used in cartilage replacement with a therapeutic effect in the immediate postoperative period.
Purpose. To improve the efficiency of closed-cycle operation of the field-oriented induction machine in dynamic behavior when load conditions are changing, considering the nonlinearities of the main inductance.
Methodology. The optimal control problem is defined as the minimization of the time integral of the energy losses. The algorithm described in this paper uses Matlab/Simulink, the dSPACE real-time interface, and the C language. Real-time applications are handled in the ControlDesk experiment software for seamless ECU development.
Findings. A discrete-time model with an integrated predictive control scheme where the optimization is performed online at every sampling step has been developed. The optimal field-producing current trajectory is determined so that the copper losses are minimized over a wide operational range. Additionally, a comparison of measurement results with conventional methods is provided, which validates the advantages and performance of the control scheme.
Originality. To solve the given problem, the information vector of the current state of the coordinates of the electromechanical system is used to form the control action in the dynamic mode of operation. For the first time, the formation of the control takes into account both the current state and the desired future state of the system in the real-time domain.
Practical value. A predictive, iterative approach to the optimal flux level of an induction machine is important for generating the required electromagnetic torque while simultaneously reducing power losses.
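As an illustration of the optimisation performed at every sampling step, the Python sketch below minimises copper losses over the field-producing current for a requested torque. It is only a stand-in for the actual implementation, which uses Matlab/Simulink, dSPACE and C; the machine parameters and the simplified loss and torque models are assumptions for demonstration.

# Illustrative stand-in for the per-sample optimisation: choose the
# field-producing current i_d that minimises copper losses while still
# producing the requested torque. Parameters below are hypothetical.
from scipy.optimize import minimize_scalar

R_s, R_r = 2.0, 1.5   # stator / rotor resistance [Ohm] (assumed)
k_t = 1.2             # simplified torque constant, T = k_t * i_d * i_q (assumed)

def copper_losses(i_d, torque_ref):
    i_q = torque_ref / (k_t * i_d)            # i_q needed for the requested torque
    return R_s * (i_d**2 + i_q**2) + R_r * i_q**2

def optimal_id(torque_ref, i_d_max=10.0):
    res = minimize_scalar(copper_losses, bounds=(0.1, i_d_max),
                          args=(torque_ref,), method="bounded")
    return res.x

for T_ref in (2.0, 5.0, 8.0):
    print(f"T* = {T_ref} Nm -> i_d = {optimal_id(T_ref):.2f} A")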
The planning and control of intralogistics systems in line with the versatile production systems of smart factories requires new approaches and methods to cope with changing requirements within future factories. The planning of intralogistics can no longer follow a static, sequential approach as in the past, since the planning assumptions are going to change with high frequency. Reasons for these constant changes are, among others, external turbulences such as rapidly changing market conditions and decreasing batch sizes down to customer-specific products with a batch size of one, and on the other hand internal turbulences (such as production and logistics resource breakdowns) affecting the production system. This paper gives an insight into research approaches and results on how the capabilities of intelligent logistical objects (intelligent bins, autonomous transport systems etc.) can be used to achieve a self-organized, cost- and performance-optimized intralogistics system with autonomously controlled process execution within versatile production environments. A first consistent method has been developed, validated and implemented within a scenario at the pilot factory Werk150 at the ESB Business School (Reutlingen University). Based on the incoming production orders, the method of the Extended Profitability Appraisal (EPA), covering the work system value, is applied to define the most effective work system for order fulfilment. To derive the appropriate intralogistics processes, an autonomous control method involving principles of decentralized and target-oriented decision-making (e.g. intelligent bins interacting with autonomously controlled transport systems to fulfil material orders of assembly workstations) has been developed and applied to achieve a target-optimized process execution. The results of the first-stage research using predefined material sources and sinks described in this paper are going to set the basis for the further development of a self-organized and autonomously controlled method for intralogistics systems considering dynamic source and sink relations. By allowing dynamic shifts of production orders in the sense of dynamic source and sink relations, the cost and performance aims of the intralogistics system can be directly aligned with the aims of the entire versatile production system in the sense of self-organized and autonomously controlled systems.
The approach of self-organized and autonomously controlled systems offers great potential to meet new requirements for the economical production of customized products in small batch sizes, based on a distributed, flexible management of dynamics and complexity within the production and intralogistics system. To support the practical application of self-organization for intralogistics systems, a catalogue of criteria for evaluating the self-organization of flexible logistics systems has been developed and validated, which enables the classification of logistics systems as well as the identification and evaluation of corresponding potentials that can be achieved by increasing the degree of self-organization.
Rapidly changing market conditions and global competition are leading to an increasing complexity of logistics systems and require innovative approaches with respect to the organisation and control of these systems. In scientific research, concepts of autonomously controlled logistics systems show a promising approach to meet the increasing requirements for flexible and efficient order processing. In this context, this work aims to introduce a system that is able to adjust order processing dynamically, and optimise intralogistics transportation regarding various generic intralogistics target criteria. The logistics system under consideration consists of various means of transport for autonomous decision-making and fulfilment of transport orders with defined source-sink relationships. The context of this work is set by introducing the Learning Factory Werk 150 with its existing hardware and software infrastructure and its defined target figures to measure the performance of the system. Specifically, the important target figures cost and performance are considered for the transportation system. The core idea of the system’s logic is to solve the problem of order allocation to specific means of transport by linking a Genetic Algorithm with a Multi-Agent System. The implementation of the developed system is described in an application scenario at the learning factory.
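The following Python sketch shows the genetic-algorithm part of such an order-allocation logic in its simplest form: a chromosome assigns each transport order to a means of transport, and fitness is the negative total cost. Orders, agents and cost values are hypothetical placeholders; in the system described above, the GA is additionally coupled with a multi-agent system in which the means of transport act as negotiating agents, which is omitted here.

# Minimal GA sketch for allocating transport orders to means of transport.
# All data (orders, agents, costs) are illustrative placeholders.
import random

ORDERS = list(range(8))               # transport order ids
AGENTS = ["agv_1", "agv_2", "tugger"]
COST = {(o, a): random.uniform(1, 10) for o in ORDERS for a in AGENTS}  # e.g. travel time

def fitness(chromosome):              # chromosome[i] = agent assigned to order i
    return -sum(COST[(o, a)] for o, a in zip(ORDERS, chromosome))

def evolve(pop_size=30, generations=50, mutation_rate=0.1):
    pop = [[random.choice(AGENTS) for _ in ORDERS] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]                        # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            p1, p2 = random.sample(parents, 2)
            cut = random.randrange(1, len(ORDERS))
            child = p1[:cut] + p2[cut:]                       # one-point crossover
            if random.random() < mutation_rate:               # mutation
                child[random.randrange(len(ORDERS))] = random.choice(AGENTS)
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

print(evolve())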
Learning factories can complement each other by training different competencies in the field of digitalisation and Industry 4.0. They depict diverse sections of the product development process and focus on various technologies. Within the framework of the International Association of Learning Factories (IALF), the operating organisations of learning factories exchange information on research, training and education. One of the aims is to develop joint projects. The article presents different concepts of cooperation between learning factories while focusing on the improvement of the development of learners' competencies, e.g. with a broader range of topics. A concept of a joint course between the learning factories in Bochum, Reutlingen and Darmstadt is explained in detail. The three learning factories will be examined with regard to their similarities and differences. The joint course focuses on the target group of students and the topic of digitalisation in the development and production of products. The course and its contents are explained in detail. The new learning approach is evaluated on the basis of feedback from the participants. Finally, challenges resulting from the cooperation between learning factories at different locations and with different operating models will be discussed.
This paper presents a novel emulation concept for testing smart contracts and Distributed Ledger Technologies (DLT) in distributed control or energy economy tasks and use cases. The concept uses state-of-the-art behavioral modeling tools such as Matlab Simulink and presents a possible way to overcome Simulink's shortfall in communicating with DLT nodes directly; this is solved through a middleware solution. After this, an example used in verifying the test bed is presented and the target demonstration object is described. Finally, the possible expansion of the system is discussed and presented.
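The paper does not publish its middleware, but a minimal form of such a bridge could look like the Python sketch below: Simulink streams values over UDP (e.g. via its UDP Send block), and a small relay forwards each value as a JSON-RPC call to a DLT node. The node endpoint, the RPC method name and the datagram layout are assumptions for illustration only.

# Hypothetical UDP-to-JSON-RPC relay between Simulink and a DLT node.
# Endpoint, method name and payload layout are assumed, not from the paper.
import json
import socket
import struct
import requests

NODE_RPC = "http://localhost:8545"        # assumed local DLT node endpoint

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", 25000))             # port targeted by the Simulink UDP Send block (assumed)

while True:
    data, _ = sock.recvfrom(1024)
    value = struct.unpack("<d", data[:8])[0]       # one little-endian double per datagram (assumed)
    rpc = {"jsonrpc": "2.0", "id": 1,
           "method": "dlt_submitMeasurement",      # hypothetical node method
           "params": [{"value": value}]}
    requests.post(NODE_RPC, data=json.dumps(rpc),
                  headers={"Content-Type": "application/json"}, timeout=5)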
This paper aims at presenting a solution that enables end customers of the energy system to participate in new local micro-energy markets by providing them with a distributed, decentralized, transparent and secure Peer-to-Peer (P2P) payment system which functions automatically, applying new concepts of Machine-to-Machine (M2M) communication technologies. This work was performed within the German project VK_2G, funded by the DBU. The key results were: providing means to perform microtransactions in a P2P fashion between end consumers and prosumers in local communities at low cost in a transparent and secure manner; developing a platform with pre-defined smart contracts that can easily be tailored to different end customers' needs; and integrating both the market platform and the local control of generation and loads. This solution has been developed, integrated and tested in a laboratory prototype. This paper discusses this solution and presents the results of the first test.
Facial expressions play a dominant role in facilitating social interactions. We endeavor to develop tactile displays to reinstate facial expression modulated communication. The high spatial and temporal dimensionality of facial movements poses a unique challenge when designing tactile encodings of them. A further challenge is developing encodings that are attuned to the perceptual characteristics of our skin. A caveat of using vibrotactile displays is that tactile stimuli have been shown to induce perceptual tactile aftereffects when used on the fingers, arm and face. However, at present, despite the prevalence of waist-worn tactile displays, no such investigations of tactile aftereffects at the waist region exist in the literature, though they are warranted by the unique sensory and perceptual signalling characteristics of this area. Using an adaptation paradigm we investigated the presence of perceptual tactile aftereffects induced by continuous and burst vibrotactile stimuli delivered at the navel, side and spinal regions of the waist. We report evidence that the tactile perception topology of the waist is non-uniform, and specifically that the navel and spine regions are resistant to adaptive aftereffects while side regions are more prone to perceptual adaptations to continuous but not burst stimulations. Results of our current investigations highlight the unique set of challenges posed by designing waist-worn tactile displays. These and future perceptual studies can directly inform more realistic and effective implementations of complex high-dimensional spatiotemporal social cues.
Deutschland, quo vadis?
(2020)
Shutdown in Germany in March 2020. Standstill in trade and industry. The market value of a considerable number of companies halved within a very short time. Investors threw everything onto the market. And given the high level of uncertainty, all asset classes lost value, at times even gold. Even corporations such as Lufthansa will no longer manage to survive without state aid.
In the GalvanoFlex_BW project, various options for improving the energy efficiency of energy-intensive industrial companies were identified and investigated. The introduction of combined heat and power (CHP) in electricity-optimised operation was a particular focus. In addition to the technical investigation, a social-science perspective was also taken in order to consider the introduction and implementation of corresponding measures from this angle as well. A further important focus of the project was the transfer of the acquired knowledge to other companies, institutions etc. that were not directly involved in the project.
Through the collaboration of four research partners and three industrial partners, the work was carried out in a practice-oriented manner on the basis of real measurement data and through surveys of the people involved at the project partners, in the sense of a real-world laboratory.
The need to save energy, and thus to implement efficiency measures, is an important step towards achieving the reduction in greenhouse gases planned by the EU. Furthermore, energy costs are expected to continue to rise in the future. Measures to increase energy efficiency therefore become an important element in ensuring competitive production. Within the project, various energy efficiency measures specifically for electroplating were therefore researched, analysed and compiled into a catalogue of measures. In addition, an assessment method was developed to support electroplating companies in identifying sensible energy efficiency measures.
The investigations into the implementation of combined heat and power plants showed, using the example of two companies with very different electricity and heat demands, that the use of such plants is economically worthwhile. In the most favourable case, payback periods of about two years result. It was also shown that the dimensioning of the cogeneration unit depends strongly on the electricity and heat demand and that the buffer storage should by no means be undersized. However, an intelligent, electricity-optimised control with peak-load management can often improve the economic efficiency only slightly compared to heat-led operation. The reason is that the current subsidy and remuneration system for combined heat and power plants offers almost no incentives for operation oriented towards electricity demand and thus towards covering residual load. Only for companies with pronounced peaks in their electricity purchase can the targeted use of a cogeneration unit lead to a significant reduction in the demand charge.
Furthermore, the introduction of a combined heat and power plant must generally be regarded as a complex energy efficiency measure which, in addition to the economic aspects, places increased demands on the companies and their professional environment; these were examined as part of the accompanying social-science research. Drivers as well as barriers to the implementation of the technology could be identified both within and outside the companies. The internal barriers can be traced back to different causes, such as the high complexity of CHP technology, the difficulty of assessing the overall benefit within the company, insufficient staffing and a lack of management decisions. Improved advisory services, particularly from neutral bodies, as well as plant operation under contracting models can provide a remedy and thus act as drivers.
The results developed in the project were already transferred during the project period via the industry platform in the course of various workshops aimed specifically at companies and institutions outside the project consortium. To intensify the dissemination of the acquired knowledge, a series of four articles was published in a well-known industry magazine at the end of the project, and a project website was created (www.galvanoflex_bw.de). The latter not only serves to disseminate knowledge but is also intended to act as a contact platform beyond the end of the project, in order to support the implementation of efficiency measures with the knowledge generated in the project.
Automatic anode rod inspection in aluminum smelters using deep-learning techniques: a case study
(2020)
Automatic fault detection using machine learning has become an exciting and promising area of research, because it provides an accurate and timely way to manage and classify faults with minimal human effort. In the computer vision community, deep-learning methods have become the most suitable approaches for this task. Anodes are large carbon blocks that are used to conduct electricity during the aluminum reduction process. The most basic function of anode rod inspection is to prevent a situation where the anode rod will not fit into the stub-holes of a new anode, which would be the case for a rod with severe toe-in, missing stubs, or a retained thimble on one or more stubs. In this work, to improve the accuracy of shape defect inspection for an anode rod, we use the Fast Region-based Convolutional Network (Fast R-CNN) model. To train the detection model, we collect an image dataset composed of multiple classes of anode rod defects with annotated labels. Our model is trained using a small number of samples, an essential requirement in industry where the number of available defective samples is limited. It can simultaneously detect multiple classes of anode rod defects in near real time.
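A hedged sketch of how such a detector could be fine-tuned with a recent torchvision release is given below; torchvision's readily available Faster R-CNN is used here as a stand-in for the Fast R-CNN named in the abstract, and the defect classes, dataset and training loop details are assumptions.

# Stand-in sketch: fine-tune torchvision's Faster R-CNN for anode rod defects.
# Class names, dataset and training details are assumed, not from the paper.
import torch
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

NUM_CLASSES = 4  # background + e.g. toe-in, missing stub, retained thimble (assumed)

model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
in_features = model.roi_heads.box_predictor.cls_score.in_features
model.roi_heads.box_predictor = FastRCNNPredictor(in_features, NUM_CLASSES)

optimizer = torch.optim.SGD(model.parameters(), lr=0.005, momentum=0.9)
model.train()
# A hypothetical `data_loader` would yield (images, targets) with boxes and labels;
# one training step would then look like:
#   loss_dict = model(images, targets)
#   losses = sum(loss_dict.values())
#   optimizer.zero_grad(); losses.backward(); optimizer.step()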
Sol-gel-based flame retardants represent a promising approach for textiles, particularly as a replacement for the currently established halogen-containing flame retardants. The latter have come under criticism due to their toxicological concerns and their sometimes bioaccumulative properties. This research project therefore investigated how halogen-free flame retardants can be realised from phosphorus- and nitrogen-containing silanes. The sol-gel layer acted on the one hand as a non-flammable binder; on the other hand, flame-retardant-active groups could be incorporated directly by attaching phosphorus groups to commercially available silanes. Various synthesis approaches were pursued, and all of the synthesised N-P silanes achieved flame protection according to DIN EN ISO 15025 (protective clothing - protection against heat and flame). The flame-retardant effect depends strongly on the functional groups and the oxidation state of the phosphorus; a corresponding flame protection could be achieved at add-ons of 5 %. It was shown that a mechanism based on the formation of a protective layer is mainly responsible for the flame retardancy, a result that should not be underestimated for the future optimisation of corresponding finishes. Finishing trials on a semi-industrial scale further showed that, in principle, nothing stands in the way of a large-scale implementation of the applied finishes. Depending on the functional group on the phosphorus, the water solubility and the wash stability could be controlled. It was shown that hydrophobic N-P silanes exhibit better wash resistance, while hydrophilic N-P silanes only attain it at fixation temperatures of 180 °C. Based on these results, nitrogen-generating and cyanuric-acid-based N-P silanes were developed, which are characterised in particular by a good flame-retardant effect on blended fabrics. Overall, the research project showed that N-P silanes are excellent permanent flame retardants for textiles and on which mechanism this flame protection is based.
With the continuous development of the economy, consumers pay more attention to the demand for personalized clothing. However, the recommendation quality of existing clothing recommendation systems is not sufficient to meet users' needs. When browsing online clothing, the facial expression is salient information for understanding the user's preference. In this paper, we propose a novel method to automatically personalize clothing recommendation based on user emotional analysis. Firstly, the facial expression is classified by a multiclass SVM. Next, the user's multi-interest value is calculated using the expression intensity obtained by a hybrid RCNN. Finally, the multi-interest value is fused to carry out personalized recommendation. The experimental results show that the proposed method achieves a significant improvement over other algorithms.
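A minimal Python sketch of this kind of pipeline is shown below: a multiclass SVM predicts the expression class, and an interest value is fused from the class probabilities and an intensity score. The features, training data, expression set and fusion weights are illustrative placeholders, not the hybrid RCNN or the weighting used in the paper.

# Illustrative sketch: multiclass SVM for expression classes plus a simple
# fusion with an intensity score. All data and weights are placeholders.
import numpy as np
from sklearn.svm import SVC

EXPRESSIONS = ["neutral", "happy", "surprised", "disgusted"]

# toy training data: 64-dim face features with expression labels
rng = np.random.default_rng(1)
X_train = rng.normal(size=(200, 64))
y_train = rng.integers(0, len(EXPRESSIONS), 200)
clf = SVC(kernel="rbf", probability=True).fit(X_train, y_train)

# assumed mapping from expression to "interest in the shown garment"
INTEREST_WEIGHT = {"neutral": 0.2, "happy": 1.0, "surprised": 0.7, "disgusted": 0.0}

def interest_value(face_features, intensity):
    """Fuse class probabilities with an intensity score in [0, 1]."""
    proba = clf.predict_proba(face_features.reshape(1, -1))[0]
    weighted = sum(p * INTEREST_WEIGHT[EXPRESSIONS[i]] for i, p in enumerate(proba))
    return weighted * intensity

print(interest_value(rng.normal(size=64), intensity=0.8))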
”I have never seen one who loves virtue as much as he loves beauty,” Confucius once said. If beauty is more important than goodness, it becomes clear why people invest so much effort in their first impression. The aesthetic of faces has many aspects and there is a strong correlation to all characteristics of humans, like age and gender. Often, research on aesthetics by social and ethics scientists lacks sufficient labelled data and the support of machine vision tools. In this position paper we propose the Aesthetic-Faces dataset, containing training data which is labelled by Chinese and German annotators. As a combination of three image subsets, the AF-dataset consists of European, Asian and African people. The research communities in machine learning, aesthetics and social ethics can benefit from our dataset and our toolbox. The toolbox provides many functions for machine learning with state-of-the-art CNNs and an Extreme-Gradient-Boosting regressor, but also 3D Morphable Model technologies for face shape evaluation, and we discuss how to train an aesthetic estimator considering culture and ethics.
3D assisted 2D face recognition involves the process of reconstructing 3D faces from 2D images and solving the problem of face recognition in 3D. To facilitate the use of deep neural networks, a 3D face, normally represented as a 3D mesh of vertices and its corresponding surface texture, is remapped to image-like square isomaps by a conformal mapping. Based on previous work, we assume that face recognition benefits more from texture. In this work, we focus on the surface texture and its discriminatory information content for recognition purposes. Our approach is to prepare a 3D mesh, the corresponding surface texture and the original 2D image as triple input for the recognition network, to show that 3D data is useful for face recognition. Texture enhancement methods to control the texture fusion process are introduced and we adapt data augmentation methods. Our results show that texture-map-based face recognition can not only compete with state-of-the-art systems under the same preconditions but also outperforms standard 2D methods from recent years.
It has been widely shown that biomaterial surface topography can modulate host immune response, but a fundamental understanding of how different topographies contribute to pro-inflammatory or anti-inflammatory responses is still lacking. To investigate the impact of surface topography on immune response, we undertook a systematic approach by analyzing immune response to eight grades of medical grade polyurethane of increasing surface roughness in three in vitro models of the human immune system. Polyurethane specimens were produced with defined roughness values by injection molding according to the VDI 3400 industrial standard. Specimens ranged from 0.1 μm to 18 μm in average roughness (Ra), which was confirmed by confocal scanning microscopy. Immunological responses were assessed with THP-1-derived macrophages, human peripheral blood mononuclear cells (PBMCs), and whole blood following culture on polyurethane specimens. As shown by the release of pro-inflammatory and anti-inflammatory cytokines in all three models, a mild immune response to polyurethane was observed; however, this was not associated with the degree of surface roughness. Likewise, the cell morphology (cell spreading, circularity, and elongation) in THP-1-derived macrophages and the expression of CD molecules in the PBMC model on T cells (HLA-DR and CD16), NK cells (HLA-DR), and monocytes (HLA-DR, CD16, CD86, and CD163) showed no influence of surface roughness. In summary, this study shows that modifying surface roughness in the micrometer range on polyurethane has no impact on the pro-inflammatory immune response. Therefore, we propose that such modifications do not affect the immunocompatibility of polyurethane, thereby supporting the notion of polyurethane as a biocompatible material.
Appropriate mechanical properties and fast endothelialization of synthetic grafts are key to ensure long-term functionality of implants. We used a newly developed biostable polyurethane elastomer (TPCU) to engineer electrospun vascular scaffolds with promising mechanical properties (E-modulus: 4.8 ± 0.6 MPa, burst pressure: 3326 ± 78 mmHg), which were biofunctionalized with fibronectin (FN) and decorin (DCN). Neither uncoated nor biofunctionalized TPCU scaffolds induced major adverse immune responses except for minor signs of polymorph nuclear cell activation. The in vivo endothelial progenitor cell homing potential of the biofunctionalized scaffolds was simulated in vitro by attracting endothelial colony-forming cells (ECFCs). Although DCN coating did attract ECFCs in combination with FN (FN + DCN), DCN-coated TPCU scaffolds showed a cell-repellent effect in the absence of FN. In a tissue-engineering approach, the electrospun and biofunctionalized tubular grafts were cultured with primary-isolated vascular endothelial cells in a custom-made bioreactor under dynamic conditions with the aim to engineer an advanced therapy medicinal product. Both FN and FN + DCN functionalization supported the formation of a confluent and functional endothelial layer.
Development work within an experimental environment, in which certain properties are investigated and optimized, requires many test runs and is therefore often associated with long execution times, costs and risks. This can affect product, material and technology development in industry and research. New digital driver technologies offer the possibility to automate complex manual work steps in a cost-effective way, to increase the relevance of the results and to accelerate the processes many times over. In this context, this article presents a low-cost, modular and open-source machine vision system for test execution and evaluates it on the basis of a real industrial application. For this purpose, a methodology is presented for the automated execution of the load intervals, for process documentation and for the evaluation of the generated data by means of machine learning to classify wear levels. The software and the mechanical structure are designed to be adaptable to different conditions and components and to a variety of tasks in industry and research. The mechanical structure is required for tracking the test object and represents a motion platform with independent positioning by machine vision operators or machine learning. An evaluation of the state of the test object is performed by transfer learning after the initial documentation run. The manual procedure for classifying the visually recorded data on the state of the test object is described for the training material. This leads to increased resource efficiency on the material as well as on the personnel side, since on the one hand the significance of the tests performed is increased by the continuous documentation and on the other hand the responsible experts can be assigned time-efficiently. The presence and know-how of the experts are therefore only required for defined and decisive events during the execution of the experiments. Furthermore, the generated data are suitable for later use as an additional source of data for predictive maintenance of the developed object.
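To make the transfer-learning step more tangible, the sketch below fine-tunes a pretrained CNN so that a new head classifies wear levels from the documentation images while the backbone stays frozen. The folder layout, number of wear classes and hyperparameters are assumptions, not details of the system described above.

# Hedged transfer-learning sketch for wear-level classification (recent torchvision API).
# Folder layout, class count and hyperparameters are assumed.
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

WEAR_LEVELS = 3  # e.g. "new", "worn", "critical" (assumed labels)

tfm = transforms.Compose([transforms.Resize((224, 224)), transforms.ToTensor()])
train_set = datasets.ImageFolder("wear_images/train", transform=tfm)  # assumed layout
loader = torch.utils.data.DataLoader(train_set, batch_size=16, shuffle=True)

model = models.resnet18(weights="DEFAULT")
for p in model.parameters():          # freeze the pretrained backbone
    p.requires_grad = False
model.fc = nn.Linear(model.fc.in_features, WEAR_LEVELS)   # new classification head

opt = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()
model.train()
for images, labels in loader:         # one epoch is enough to show the idea
    opt.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    opt.step()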
Context: A product roadmap is an important tool in product development. It sets the strategic direction in which the product is to be developed to achieve the company’s vision. However, for product roadmaps to be successful, it is essential that all stakeholders agree with the company’s vision and objectives and are aligned and committed to a common product plan.
Objective: In order to gain a better understanding of product roadmap alignment, this paper aims at identifying measures, activities and techniques in order to align the different stakeholders around the product roadmap.
Method: We conducted a grey literature review according to the guidelines of Garousi et al.
Results: Several approaches to gain alignment were identified such as defining and communicating clear objectives based on the product vision, conducting cross-functional workshops, shuttle diplomacy, and mission briefing. In addition, our review identified the “Behavioural Change Stairway Model” that suggests five steps to gain alignment by building empathy and a trustful relationship.
Context: Organizations are increasingly challenged by dynamic and technical market environments. Traditional product roadmapping practices such as detailed and fixed long-term planning typically fail in such environments. Therefore, companies are actively seeking ways to improve their product roadmapping approach.
Goal: This paper aims at identifying problems and challenges with respect to product roadmapping. In addition, it aims at understanding how companies succeed in improving their roadmapping practices in their respective company contexts.
Method: We conducted semi-structured expert interviews with 15 experts from 13 German companies and conducted a thematic data analysis.
Results: The analysis showed that a significant number of companies are still struggling with traditional feature-based product roadmapping and opinion-based prioritization of features. The most promising areas for improvement are stating the outcomes a company is trying to achieve and making them part of the roadmap, sharing or co-developing the roadmap with stakeholders, and establishing discovery activities.
The emergence of agile methods and practices has not only changed the development processes but might also have affected how companies conduct software process improvement (SPI). Through a set of complementary studies, we aim to understand how SPI has changed in times of agile software development. Specifically, we aim (1) to identify and characterize the set of publications that connect elements of agility to SPI, (2) to explore to which extent agile methods/practices have been used in the context of SPI, and (3) to understand whether the topics addressed in the literature are relevant and useful for industry professionals. To study these questions, we conducted an in-depth analysis of the literature identified in a previous mapping study, an interview study, and an analysis of the responses given by industry professionals to SPI-related questions stemming from an independently conducted survey study.
Background
A key current task of electrocardiographic examinations is to increase the reliability of diagnosing the condition of the heart. Within the framework of this task, an important direction is the solution of the inverse problem of electrocardiography, based on the processing of electrocardiographic signals of multichannel cardio leads at known electrode coordinates in these leads (Titomir et al. Noninvasive electrocardiotopography, 2003), (Macfarlane et al. Comprehensive Electrocardiology, 2nd ed. (Chapter 9), 2011).
Results
In order to obtain more detailed information about the electrical activity of the heart, we carry out a reconstruction of the distribution of equivalent electrical sources on the heart surface. In this area, we perform the reconstruction of the equivalent sources during the cardiac cycle at relatively low hardware cost. ECG maps of electrical potentials on the surface of the torso (TSPM) and of electrical sources on the surface of the heart (HSSM) were studied for different times of the cardiac cycle. We carried out a visual and quantitative comparison of these maps in the presence of pathological regions of different localization. For this purpose we used a model of the heart's electrical activity based on cellular automata.
Conclusions
The model of cellular automata allows us to consider the processes of heart excitation in the presence of pathological regions of various sizes and localization. It is shown that changes in the distribution of electrical sources on the surface of the epicardium in the presence of pathological areas with disturbances in the conduction of heart excitation are much more noticeable than changes in ECG maps on the torso surface.
Regardless of company size or industry sector, a majority of project teams and companies use customized processes that combine different development methods, so-called hybrid development methods. Even though such hybrid development methods are highly individualized, a common understanding of how to systematically construct synergetic practices is missing. Based on 1,467 data points from a large-scale online survey among practitioners, we study the current state of practice in process use to answer the question: What are hybrid development methods made of? Our findings reveal that only eight methods and few practices build the core of modern software development. This small set allows for statistically constructing hybrid development methods.
Digital technologies are main strategic drivers for digitalization and offer ubiquitous data availability, unlimited connectivity, and massive processing power for a fundamentally changing business. This leads to the development and application of intelligent digital systems. The current state of research and practice of architecting digital systems and services lacks a solid methodological foundation that fully accommodates all requirements linked to efficient and effective development of digital systems in organizations. Research presented in this paper addresses the question of how the management of complexity in digital systems and architectures can be supported from a methodological perspective. In this context, the current focus is on a better understanding of the causes of increased complexity and the requirements for methodological support. For this purpose, we take an enterprise architecture perspective, i.e. how the introduction of digital systems affects the complexity of EA. Two industrial case studies and a systematic literature analysis result in the proposal of an extended Digital Enterprise Architecture Cube as a framework for future methodical support.
Artificial Intelligence-based Assistants (AIAs) are spreading quickly both in homes and offices. They have already left their original habitat of "intelligent speakers" providing easy access to music collections. They have initiated a multitude of new devices and are already populating devices such as TV sets. Characteristic for the intelligent digital assistants is the formation of platforms around their core functionality. Thus, the AI capabilities of the assistants are used to offer new services and create new interfaces for business processes. There are positive network effects between the assistants and the services as well as within the services. Therefore, many companies see the need to get involved in the field of digital assistants but lack a framework to align their initiatives with their corporate strategies. In order to lay the foundation for a comprehensive method, we are therefore investigating intelligent digital assistants. Based on this analysis, we are developing a framework of strategic opportunities and challenges.
Artificial Intelligence enables innovative applications, and applications based on Artificial Intelligence are increasingly important for all aspects of the Digital Economy. However, the question of how AI resources such as tools and data can be linked to provide an AI-capability and create business value is still open. Therefore, this paper identifies the value-creating mechanisms of connectionist artificial intelligence using a capability-oriented view and points out the connections to different kinds of business value. The analysis supports an agenda that identifies areas that need further research to understand the mechanism of value creation in connectionist artificial intelligence.
Scenario-based analysis is a comprehensive technique to evaluate software quality and can provide more detailed insights than e.g. maintainability metrics. Since such methods typically require significant manual effort, we designed a lightweight scenario-based evolvability evaluation method. To increase efficiency and to limit assumptions, the method exclusively targets service- and microservice-based systems. Additionally, we implemented web-based tool support for each step. Method and tool were also evaluated with a survey (N=40) that focused on change effort estimation techniques and hands-on interviews (N=7) that focused on usability. Based on the evaluation results, we improved method and tool support further. To increase reuse and transparency, the web-based application as well as all survey and interview artifacts are publicly available on GitHub. In its current state, the tool-supported method is ready for first industry case studies.
Context: Fast moving markets and the age of digitization require that software can be quickly changed or extended with new features. The associated quality attribute is referred to as evolvability: the degree of effectiveness and efficiency with which a system can be adapted or extended. Evolvability is especially important for software with frequently changing requirements, e.g. internet-based systems. Several evolvability-related benefits were arguably gained with the rise of service-oriented computing (SOC) that established itself as one of the most important paradigms for distributed systems over the last decade. The implementation of enterprise-wide software landscapes in the style of service-oriented architecture (SOA) prioritizes loose coupling, encapsulation, interoperability, composition, and reuse. In recent years, microservices quickly gained in popularity as an agile, DevOps-focused, and decentralized service-oriented variant with fine-grained services. A key idea here is that small and loosely coupled services that are independently deployable should be easy to change and to replace. Moreover, one of the postulated microservices characteristics is evolutionary design.
Problem Statement: While these properties provide a favorable theoretical basis for evolvable systems, they offer no concrete and universally applicable solutions. As with each architectural style, the implementation of a concrete microservice-based system can be of arbitrary quality. Several studies also report that software professionals trust in the foundational maintainability of service orientation and microservices in particular. A blind belief in these qualities without appropriate evolvability assurance can lead to violations of important principles and therefore negatively impact software evolution. In addition to this, very little scientific research has covered the areas of maintenance, evolution, or technical debt of microservices.
Objectives: To address this, the aim of this research is to support developers of microservices with appropriate methods, techniques, and tools to evaluate or improve evolvability and to facilitate sustainable long-term development. In particular, we want to provide recommendations and tool support for metric-based as well as scenario-based evaluation. In the context of service-based evolvability, we furthermore want to analyze the effectiveness of patterns and collect relevant antipatterns.
Methods: Using empirical methods, we analyzed the industry state of the practice and the academic state of the art, which helped us to identify existing techniques, challenges, and research gaps. Based on these findings, we then designed new evolvability assurance techniques and used additional empirical studies to demonstrate and evaluate their effectiveness. Applied empirical methods were for example surveys, interviews, (systematic) literature studies, or controlled experiments.
Contributions: In addition to our analyses of industry practice and scientific literature, we provide contributions in three different areas. With respect to metric-based evolvability evaluation, we identified a set of structural metrics specifically designed for service orientation and analyzed their value for microservices. Subsequently, we designed tool-supported approaches to automatically gather a subset of these metrics from machine-readable RESTful API descriptions and via a distributed tracing mechanism at runtime. In the area of scenario-based evaluation, we developed a tool-supported lightweight method to analyze the evolvability of a service-based system based on hypothetical evolution scenarios. We evaluated the method with a survey (N=40) as well as hands-on interviews (N=7) and improved it further based on the findings. Lastly with respect to patterns and antipatterns, we collected a large set of service-based patterns and analyzed their applicability for microservices. From this initial catalogue, we synthesized a set of candidate evolvability patterns via the proxy of architectural modifiability tactics. The impact of four of these patterns on evolvability was then empirically tested in a controlled experiment (N=69) and with a metric-based analysis. The results suggest that the additional structural complexity introduced by the patterns as well as developers' pattern knowledge have an influence on their effectiveness. As a last contribution, we created a holistic collection of service-based antipatterns for both SOA and microservices and published it in a collaborative repository.
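As a small illustration of the idea of deriving structural service metrics from machine-readable RESTful API descriptions (this is not the thesis' own tooling), the following Python sketch counts operations and schema definitions in an OpenAPI file as simple interface-size indicators; the file name and the metric choice are assumptions.

# Illustrative sketch: simple interface-size indicators from an OpenAPI file.
# File name and metric choice are assumed, not the thesis' tool.
import yaml  # pip install pyyaml

HTTP_METHODS = {"get", "put", "post", "delete", "patch", "head", "options"}

def interface_metrics(openapi_file):
    with open(openapi_file) as f:
        spec = yaml.safe_load(f)
    operations = sum(1 for path in spec.get("paths", {}).values()
                     for method in path if method in HTTP_METHODS)
    schemas = len(spec.get("components", {}).get("schemas", {}))
    return {"operations": operations, "schemas": schemas}

print(interface_metrics("order-service.yaml"))   # hypothetical service description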
Conclusion: Our contributions provide first foundations for a holistic view on the evolvability assurance of microservices and address several perspectives. Metric- and scenario-based evaluation as well as service-based antipatterns can be used to identify "hot spots" while service-based patterns can remediate them and provide means for systematic evolvability construction. All in all, researchers and practitioners in the field of microservices can use our artifacts to analyze and improve the evolvability of their systems as well as to gain a conceptual understanding of service-based evolvability assurance.
Polyelectrolyte multilayer coatings (PEM) are prepared by alternating layer-by-layer deposition of cationic and anionic polyelectrolyte monolayers on charged surfaces. The thickness of the coatings ranges from nanometres to a few μm. Their properties, such as roughness, stiffness, surface charge and surface energy, can be precisely tuned to fulfil different technical or biological requirements. The coating process is based on the self-assembly of polyelectrolytes. Advantages of these coatings are their easy handling, the absence of harsh chemistry and the possibility of coating complex geometries. PEM coatings can be prepared from a variety of suitable polyelectrolytes. Their stability varies from very durable PEM coatings that are only soluble in strong solvents to quickly degradable ones, which may be applied as drug release systems. One example of such a degradable PEM system is the one based on the polyelectrolyte pair Hyaluronan (HA) and Chitosan (CHI). These biopolymers originate from natural sources and show low toxicity towards human cells. However, HA/CHI multilayers show only weak adhesiveness for human umbilical vein endothelial cells (HUVEC). In this article, we summarize our approaches to enhance the HA/CHI multilayer by incorporating a non-polymer substance, graphene oxide, to improve cell adhesion while keeping properties such as low cytotoxicity and biodegradability. Different approaches for the incorporation of graphene oxide were performed and the cellular adhesion was tested by a metabolic assay.
Introduction: Bioresorbable collagenous barrier membranes are used to prevent premature soft tissue ingrowth and to allow bone regeneration. For volume stable indications, only non-absorbable synthetic materials are available. This study investigates a new bioresorbable hydrofluoric acid (HF)-treated magnesium (Mg) mesh in a native collagen membrane for volume stable situations. Materials and Methods: HF-treated and untreated Mg were compared in direct and indirect cytocompatibility assays. In vivo, 18 New Zealand White Rabbits each received four 8 mm calvarial defects and were divided into four groups: (a) HF-treated Mg mesh/collagen membrane, (b) untreated Mg mesh/collagen membrane, (c) collagen membrane and (d) sham operation. After 6, 12 and 18 weeks, Mg degradation and bone regeneration were measured using radiological and histological methods. Results: In vitro, HF-treated Mg showed higher cytocompatibility. Histopathologically, HF-Mg prevented gas cavities and was degraded by mononuclear cells via phagocytosis up to 12 weeks. Untreated Mg showed partially significantly more gas cavities and a fibrous tissue reaction. Bone regeneration was not significantly different between all groups. Discussion and Conclusions: HF-Mg meshes embedded in native collagen membranes represent a volume stable and biocompatible alternative to the non-absorbable synthetic materials. HF-Mg shows less corrosion and is degraded by phagocytosis. However, the application of membranes did not result in higher bone regeneration.
Drug-induced liver toxicity is one of the most common reasons for the failure of drugs in clinical trials and frequent withdrawal from the market. Reasons for such failures include the low predictive power of in vivo studies, that is mainly caused by metabolic differences between humans and animals, and intraspecific variances. In addition to factors such as age and genetic background, changes in drug metabolism can also be caused by disease-related changes in the liver. Such metabolic changes have also been observed in clinical settings, for example, in association with a change in liver stiffness, a major characteristic of an altered fibrotic liver. For mimicking these changes in an in vitro model, this study aimed to develop scaffolds that represent the rigidity of healthy and fibrotic liver tissue. We observed that liver cells plated on scaffolds representing the stiffness of healthy livers showed a higher metabolic activity compared to cells plated on stiffer scaffolds. Additionally, we detected a positive effect of a scaffold pre-coated with fetal calf serum (FCS)-containing media. This pre-incubation resulted in increased cell adherence during cell seeding onto the scaffolds. In summary, we developed a scaffold-based 3D model that mimics liver stiffness-dependent changes in drug metabolism that may more easily predict drug interaction in diseased livers.
Endogenous electrical fields play an important role in various physiological and pathological events. Yet the effects of electrical cues on processes such as wound healing, tumor development or metastasis are still rarely investigated, even though direct current electrical fields are known to alter cell migration and proliferation in vitro. Several 2D experimental models for studying cell responses to direct current electrical fields have been presented and characterized, but suitable experimental models for electrotaxis studies in 3D are rare. Here we present a novel, easy-to-produce, multi-well-based galvanotactic chamber for 2D and 3D cell experiments investigating the influence of electrical fields on tumor cell migration and tumor spheroid growth. The presented system allows the simultaneous application of an electrical field to cells in four chambers, either cultured on the bottom of the culture plate (2D) or embedded in hydrogel-filled channels (3D). The set-up is also suitable for live-cell imaging. Validation tests show stable electrical fields and high cell viability inside the channels. Tumor spheroids of various diameters can be exposed to direct current electrical fields for up to one week.
Thermoplastic polycarbonate urethane elastomers (TPCU) are potential implant materials for treating degenerative joint diseases thanks to their adjustable rubber-like properties, their toughness and their durability. We developed a water-containing coating of high-molecular-weight sulfated hyaluronic acid to improve the interaction of TPCU with the synovial fluid. It is suggested that trapped synovial fluid can act as a lubricant that reduces friction forces and thus provides enhanced abrasion resistance of TPCU implants. The aims of this work were (i) the development of a coating method for novel soft TPCU with high-molecular-weight sulfated hyaluronic acid to increase biocompatibility and (ii) the in vitro validation of the functionalized TPCUs in cell culture experiments.
Medical implants play a central role in modern medicine, and both naturally derived and synthetic materials have been explored as biomaterials for such devices. However, when implanted into living tissue, most materials initiate a host response. In addition, implants often cause bacterial infections leading to complications. Polyelectrolyte multilayer (PEM) coatings can be used to functionalize medical implants, improving implant integration and reducing foreign body reactions. Some PEMs are also known to show antibacterial properties. We developed a PEM coating that is expected to decrease the risk of bacterial infections occurring after implantation while being highly biocompatible. We applied two different standard tests for evaluating the PEM’s antibacterial properties, ISO 22196 and ASTM E2180. We found a reduction of bacterial growth on the PEM, but to a different degree depending on the testing method. This result demonstrates the need to define a proper method for evaluating the antibacterial properties of surface coatings.
This work presents a comparative study of survey tools, intended to help developers select a suitable tool for application in an AAL environment. The first step was to identify the basic functionality required of survey tools used with AAL technologies and to compare these tools by their functionality and intended uses. The comparative study was derived from the data obtained, previous literature studies and further technical data. A list of requirements was compiled and ordered by relevance to the target application domain. Using an integrated assessment method, a generalized estimate value was calculated for each tool, and the result is explained. Finally, the planned application of the selected tool in a running project is described.
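A minimal Python sketch of a weighted-sum assessment of this kind; the exact method, the requirement names and the ratings below are illustrative assumptions, not taken from the study.

    def weighted_score(scores, weights):
        """Generalised estimate value: weighted average of per-requirement scores."""
        assert scores.keys() == weights.keys()
        total_weight = sum(weights.values())
        return sum(scores[r] * weights[r] for r in scores) / total_weight

    # Hypothetical requirements with relevance weights and one tool's ratings (1-5)
    weights = {"offline use": 3, "data export": 2, "accessibility": 3}
    tool_a  = {"offline use": 4, "data export": 5, "accessibility": 3}
    print(weighted_score(tool_a, weights))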
In previous studies, we used a method for detecting stress that was based exclusively on the heart rate and ECG to differentiate between situations such as mental stress, physical activity, relaxation and rest. As a response of the heart to these situations, we observed different behavior in the Root Mean Square of Successive Differences of heartbeat intervals (RMSSD). This study aims to analyze whether a virtual reality headset acts as an effective stressor for future work. The RMSSD is an important marker of the parasympathetic effect on the heart and can provide information about stress. For these measurements, the RR intervals were collected using a chest belt; no additional sensors were used for the analysis. We conducted experiments with ten subjects who had to drive a simulator for 25 minutes using monitors and 25 minutes using a virtual reality headset. Before starting and after finishing each simulation, the subjects completed a survey describing their mental state. The experiment results show that driving with a virtual reality headset has some influence on the heart rate and RMSSD, but it does not significantly increase the stress of driving.
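A minimal sketch of the standard RMSSD calculation referred to above; the function and the sample RR values are illustrative, not the study's data or code.

    import numpy as np

    def rmssd(rr_intervals_ms):
        """Root Mean Square of Successive Differences of RR intervals (in ms)."""
        rr = np.asarray(rr_intervals_ms, dtype=float)
        diffs = np.diff(rr)                    # differences between adjacent beat intervals
        return float(np.sqrt(np.mean(diffs ** 2)))

    # Example with made-up RR intervals as they might come from a chest-belt sensor
    print(rmssd([812, 845, 790, 830, 805]))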
Cardiovascular diseases are directly or indirectly responsible for up to 38.5% of all deaths in Germany and thus represent the most frequent cause of death. At present, heart diseases are mainly discovered by chance during routine visits to the doctor or when acute symptoms occur. However, there is no practical method to proactively detect diseases or abnormalities of the heart in the daily environment and to take preventive measures for the person concerned. Long-term ECG devices, as currently used by physicians, are too expensive, impractical, and not widely available for everyday use. This work aims to develop an ECG device suitable for everyday use that can be worn directly on the body. For this purpose, an existing hardware platform is analyzed and its potential for improvement is identified. A precise picture of the existing data quality is obtained by metrological examination, and corresponding requirements are defined. Based on the identified optimization potential, a new ECG device is developed. The revised ECG device is characterized by a high integration density and combines all components except the battery and the ECG electrodes directly on one board. The compact design allows the device to be attached directly to the chest. An integrated microcontroller allows digital signal processing without the need for an additional computer. Central features of the evaluation are peak detection for R-peaks and calculation of the current heart rate based on the RR interval. To ensure the validity of the detected R-peaks, a model of the anatomical conditions is used, so that unrealistic RR intervals can be excluded. The wireless interface allows continuous transmission of the calculated heart rate. Following the development of hardware and software, the results are verified and appropriate conclusions about the data quality are drawn. As a result, a very compact and wearable ECG device with different wireless technologies, data storage, and evaluation of RR intervals was developed. Some tests yielded runtimes of up to 24 hours with wireless LAN activated and streaming enabled.
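A hedged sketch of the kind of RR-interval plausibility check described above; the bpm thresholds and the function itself are assumptions for illustration, not the device's actual anatomical model.

    def heart_rate_from_rr(rr_ms, min_bpm=30, max_bpm=220):
        """Return the heart rate in bpm for a single RR interval, or None if implausible."""
        if rr_ms <= 0:
            return None
        bpm = 60000.0 / rr_ms          # RR interval in milliseconds -> beats per minute
        if not (min_bpm <= bpm <= max_bpm):
            return None                # unrealistic interval, likely a missed or false R-peak
        return bpm

    print(heart_rate_from_rr(800))     # ~75 bpm
    print(heart_rate_from_rr(150))     # None: 400 bpm is physiologically implausible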
Globalization has increased the number of road trips and vehicles, and with it the number of traffic accidents, which have become one of the most important causes of death worldwide. Traffic accidents are often due to human error, the probability of which increases when the cognitive ability of the driver decreases. Cognitive capacity is closely related to the driver’s mental state as well as to external factors such as the CO2 concentration inside the vehicle. The objective of this work is to analyze how these elements affect driving. We conducted an experiment with 50 drivers who each drove for 25 min in a driving simulator. The drivers completed a survey at the start and end of the experiment to obtain information about their mental state. In addition, during the test their stress level was monitored using biometric sensors, and the state of the environment (temperature, humidity and CO2 level) was recorded. The results of the experiment show that the initial level of stress and tiredness of the driver can have a strong impact on the stress, driving behavior and fatigue produced by the driving test. Other elements such as sadness and the conditions inside the vehicle also impair driving and affect compliance with traffic regulations.
Comparison of sleep characteristics measurements: a case study with a population aged 65 and above
(2020)
Good sleep is crucial for a healthy life. Unfortunately, sleep quality often decreases with age. A common approach to measuring sleep characteristics is to interview the subjects or let them fill in a daily questionnaire and afterwards evaluate the obtained data. However, this method entails time and personnel costs for the interviewer and the evaluator of the responses. It would therefore be desirable to collect and evaluate sleep characteristics automatically. To do so, it is necessary to investigate the level of agreement between measurements performed in the traditional way using questionnaires and measurements obtained with electronic monitoring devices. The study presented in this manuscript performs this investigation, comparing sleep characteristics such as "time going to bed", "total time in bed", "total sleep time" and "sleep efficiency". A total of 106 night records of elderly persons (aged 65+) were analyzed. The results achieved so far reveal that the degree of agreement between the two measurement methods varies substantially across characteristics, from a mean difference of 31 minutes for "time going to bed" to 77 minutes for "total sleep time". For this reason, a direct substitution of the subjective measuring method by the objective one is currently not possible.
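A minimal sketch of one plausible reading of the agreement analysis above, assuming paired values (questionnaire vs. monitoring device) in minutes; the numbers below are invented for illustration and are not the study's data.

    import numpy as np

    def mean_abs_difference(subjective, objective):
        """Mean absolute difference between paired subjective and objective measurements."""
        s, o = np.asarray(subjective, float), np.asarray(objective, float)
        return float(np.mean(np.abs(s - o)))

    total_sleep_questionnaire = [420, 390, 450]   # hypothetical nightly totals (min)
    total_sleep_device        = [355, 470, 400]
    print(mean_abs_difference(total_sleep_questionnaire, total_sleep_device))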
We evaluate the effectiveness of different machine learning algorithms on a publicly available database of signals derived from wearable devices, with the goal of optimizing human activity recognition and classification. Among the wide range of body signals, we chose two that are easy to acquire simultaneously with widespread commercial devices (e.g. smartwatches) as well as custom wearable wireless devices designed for sport, healthcare or clinical purposes: the photoplethysmographic signal (optically detected subcutaneous blood volume) and tri-axial acceleration signals. Two widely used algorithms (decision tree and k-nearest neighbor) were tested, and their performance was compared to two recent algorithms (particle Bernstein and a Monte Carlo-based regression) in terms of both accuracy and processing time. A data preprocessing phase was also considered to improve the performance of the machine learning procedures and to reduce the problem size; a detailed analysis of the compression strategy and its results is also presented.
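A sketch of the kind of comparison described above for the two standard algorithms, using scikit-learn; the feature matrix X (stand-in for windowed PPG and tri-axial acceleration features) and the labels y are random placeholders, not the database used in the paper.

    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.metrics import accuracy_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 8))            # placeholder features per signal window
    y = rng.integers(0, 4, size=500)         # placeholder activity labels

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    for name, clf in [("decision tree", DecisionTreeClassifier(random_state=0)),
                      ("k-nearest neighbor", KNeighborsClassifier(n_neighbors=5))]:
        clf.fit(X_tr, y_tr)
        print(name, accuracy_score(y_te, clf.predict(X_te)))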
OLED technology was hailed more than ten years ago as a revolution for the packaging industry, but this revolution failed to materialize in practice. In an industrial cooperation project on developing future scenarios for the pharmaceutical packaging industry, OLED technology emerges as a key technology for the future scenario Smart Packaging 2.0.
High Performance Computing (HPC) enables significant progress in both science and industry. Whereas traditionally parallel applications have been developed to address the grand challenges in science, as of today, they are also heavily used to speed up the time-to-result in the context of product design, production planning, financial risk management, medical diagnosis, as well as research and development efforts. However, purchasing and operating HPC clusters to run these applications requires huge capital expenditures as well as operational knowledge and thus is reserved to large organizations that benefit from economies of scale. More recently, the cloud evolved into an alternative execution environment for parallel applications, which comes with novel characteristics such as on-demand access to compute resources, pay-per-use, and elasticity. Whereas the cloud has been mainly used to operate interactive multi-tier applications, HPC users are also interested in the benefits offered. These include full control of the resource configuration based on virtualization, fast setup times by using on-demand accessible compute resources, and eliminated upfront capital expenditures due to the pay-per-use billing model. Additionally, elasticity allows compute resources to be provisioned and decommissioned at runtime, which allows fine-grained control of an application's performance in terms of its execution time and efficiency as well as the related monetary costs of the computation. Whereas HPC-optimized cloud environments have been introduced by cloud providers such as Amazon Web Services (AWS) and Microsoft Azure, existing parallel architectures are not designed to make use of elasticity. This thesis addresses several challenges in the emergent field of High Performance Cloud Computing. In particular, the presented contributions focus on the novel opportunities and challenges related to elasticity. First, the principles of elastic parallel systems as well as related design considerations are discussed in detail. On this basis, two exemplary elastic parallel system architectures are presented, each of which includes (1) an elasticity controller that controls the number of processing units based on user-defined goals, (2) a cloud-aware parallel execution model that handles coordination and synchronization requirements in an automated manner, and (3) a programming abstraction to ease the implementation of elastic parallel applications. To automate application delivery and deployment, novel approaches are presented that generate the required deployment artifacts from developer-provided source code in an automated manner while considering application-specific non-functional requirements. Throughout this thesis, a broad spectrum of design decisions related to the construction of elastic parallel system architectures is discussed, including proactive and reactive elasticity control mechanisms as well as cloud-based parallel processing with virtual machines (Infrastructure as a Service) and functions (Function as a Service). To evaluate these contributions, extensive experimental evaluations are presented.
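A highly simplified sketch of a reactive elasticity control rule of the kind discussed above; the scaling heuristic, function name and parameters are assumptions for illustration, not the thesis' actual controller.

    def decide_worker_count(current_workers, observed_throughput, target_throughput,
                            min_workers=1, max_workers=64):
        """Scale the number of processing units towards a user-defined throughput goal."""
        if observed_throughput <= 0:
            return current_workers
        desired = round(current_workers * target_throughput / observed_throughput)
        return max(min_workers, min(max_workers, desired))

    # Example: 8 workers deliver 40 tasks/min, but the user-defined goal is 60 tasks/min.
    print(decide_worker_count(8, observed_throughput=40, target_throughput=60))  # -> 12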
The Twelfth International Conference on Advances in Databases, Knowledge, and Data Applications (DBKDA 2020) continued a series of events covering a large spectrum of topics related to advances in database fundamentals, the evolving relationship between databases and other domains, database technologies and content processing, as well as the specifics of databases in application domains. Advances in different technologies and domains related to databases have triggered substantial improvements in content processing, information indexing, and data, process and knowledge mining. The push came from Web services, artificial intelligence, and agent technologies, as well as from the widespread adoption of XML. High-speed communications and computations, large storage capacities, and load balancing for distributed database access allow new approaches to content processing with incomplete patterns, advanced ranking algorithms and advanced indexing methods. Developments in e-business, e-health and telemedicine, bioinformatics, finance and marketing, and geographical positioning systems put pressure on the database community to extend the de facto methods to support new requirements in terms of scalability, privacy, performance, indexing, and heterogeneity of both content and technology.
To remain relevant and mitigate disruption, traditional companies have to engage in multiple fast-paced experiments with digital offerings: revenue-generating solutions that leverage digital technologies to address customer needs. After launching several digital offering initiatives, reinsurance giant Munich Re noticed that many of them experienced similar challenges. This briefing describes how Munich Re addressed these common challenges by building a foundation for experimenting more systematically and successfully with digital offerings. The foundation has enabled Munich Re to become a serial innovator of digital offerings.
Lean management has found its way into many companies. Lean concepts place new demands on the type and structure of the required cost information, which traditional cost accounting systems cannot immediately satisfy. Proponents of "lean accounting" therefore propose partly radical changes and a simplification of cost accounting. This paper discusses the limitations of traditional cost accounting in implementing lean management and presents selected approaches to "accounting for lean". The analysis shows that lean accounting approaches are too narrowly focused and cannot adequately reflect the plurality of cost accounting functions found in practice. A radical redesign of existing cost accounting systems is therefore rejected as unrealistic and unwarranted. The paper develops alternative proposals for integrating lean management concepts and the cost information they require into traditional cost accounting systems.
Industrial production facilities account for roughly 40% of Germany's total energy demand. They have therefore been, and continue to be, optimized both technologically and energetically. Technological and economic optimization often goes hand in hand with a reduction in energy and material consumption. In addition, the expansion of renewable energy sources makes energy generation increasingly volatile, so that not only lowering absolute energy consumption but also achieving greater flexibility (controlling power over time) is becoming increasingly attractive. This often changes the installed power as well as the way waste heat is dissipated, which affects the dimensioning of equipment such as metal-cutting machine tools.