Customer orientation should be the core engine of every organisation. Information technology can be considered the enabler for generating competitive advantages through customer processes in marketing, sales and service. The impact of information technology is the biggest risk and, at the same time, a huge opportunity for any organisation. Research shows that Customer Relationship Management (CRM) enables organisations to perform better and focus more on their customers (e.g. the market capitalisation of Amazon). While global enterprises are shaping the future of customer centricity and information technology, the question arises as to how German B2B organisations can shift their value contribution from product-centric to customer-centric. These organisations are therefore attempting to implement CRM software and to put their customers more into focus. However, the question remains how organisations are approaching the implementation of CRM and whether these attempts are paying off in terms of business performance.
Contributing to this highly topical discussion, this thesis adds to the body of knowledge about the implementation of CRM in the German B2B sector and how it impacts business performance. First, theoretical frameworks were developed based on an extensive literature review. Different aspects of CRM are worked out and mapped against three dimensions of business performance, namely process efficiency, customer satisfaction and financial performance. Based on the theory, a conceptual framework was developed to test the relationships between CRM and Business Performance (BP). To this end, a survey with 500 participants was conducted, on the basis of which a measurement model was developed to test five main hypotheses.
The findings suggest that the implementation of CRM positively impacts business performance. Specifically, the usage of analytical CRM and the establishment of a dedicated CRM success measurement correlate with the performance of German B2B organisations. In addition to these main findings, various key statements could be derived from the research, and a measurement model was developed that can be used to assess BP across different organisational characteristics. As a result, CRM implementations can be enhanced and business performance improved.
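The reported correlations between CRM usage and performance boil down to correlating per-organisation scores. A minimal sketch of such a test with made-up data (the thesis itself analysed a 500-participant survey; the numbers below are purely illustrative):

```python
from math import sqrt

def pearson(x, y):
    """Pearson correlation coefficient between two equally long samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical per-organisation scores: analytical-CRM usage vs. a BP index
crm_usage = [1.0, 2.0, 2.5, 3.0, 4.0, 4.5, 5.0]
bp_index  = [1.2, 1.9, 2.7, 2.9, 3.8, 4.2, 5.1]
r = pearson(crm_usage, bp_index)   # strong positive correlation
```

A value of r close to 1 would support the hypothesised positive relationship; the thesis of course applies proper significance testing on the full sample.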
The ›AM Field Guide‹ provides a first structured overview of the complex and multifaceted world of additive manufacturing processes. Separated into the material classes polymers, metals and other materials, the most common AM processes offered on the market are presented schematically, and the core of each process is described in brief. Besides the main processes presented here, there are many derivatives and special processes that are also used but are not explicitly shown. It should be noted that, in the young field of additive manufacturing, many manufacturers give their AM applications proprietary names, so that a universally valid, comprehensive classification can only be approximated.
IT governance: current state of and future perspectives on the concept of agility in IT governance
(2020)
Digital transformation has changed corporate reality and, with that, corporates’ IT environments and IT governance (ITG). As such, the perspective of ITG has shifted from the design of a relatively stable, closed and controllable system of a self-sufficient enterprise to a relatively fluid, open, agile and transformational system of networked co-adaptive entities. Related to the paradigm shift in ITG, this thesis aims to conceptualize a framework to integrate the concept of agility into the traditional ITG framework and to test the effects of such an extended ITG framework on corporate performance.
To do so, the thesis uses literature research and a mixed method design by blending both qualitative and quantitative research methods. Given the poorly understood situation of the agile mechanisms within the ITG framework, the building process of this thesis’ research model requires an adaptive and flexible approach which involves four different research phases. The initial a priori research model based on a comprehensive review of the extant literature is critically examined and refined at the end of each research phase, which later forms the basis of a subsequent research phase. As a result, the final research model provides guidance on how the conceptualized framework leads to better business/IT alignment as well as how business/IT alignment can mediate the effectiveness of such an extended ITG framework on corporate performance.
The first research phase explores the current state of literature with a focus on the ITG-corporate performance association. This analysis identifies five perspectives with respect to the relationship between ITG and corporate performance. The main variables lead to the perspectives of business/IT alignment, IT leadership, IT capability and process performance, resource relatedness and culture. Furthermore, the analysis presents core aspects explored within the identified perspectives that could act as potential mediators or moderators in the relationship between ITG and corporate performance.
The second research phase investigates the agile aspect of an effective ITG framework in the dynamic contemporary world through a qualitative study. Drawing on 46 semi-structured interviews with governance experts across various industries, the study identifies 25 agile ITG mechanisms and 22 traditional ITG mechanisms that corporations use to master digital transformation projects. Moreover, the research reveals two key patterns indicating a call for ambidextrous ITG, with corporations alternating between stability and agility in their ITG mechanisms.
In research phase three, a scale development process is conducted in order to operationalize the agile items explored in research phase two. Through 56 qualitative interviews with professionals, the evaluation uncovers 46 agile governance mechanisms. These mechanisms are then rated by 29 experts to identify the most effective ones, leading to the identification of six structural elements, eight processes and eight relational mechanisms.
Finally, in research phase four a quantitative research approach through a survey of 400 respondents is established to test and predict the formulated relationships by using the partial least squares structural equation modelling (PLS-SEM) method. The results provide evidence for a strong causal relationship among an expanded ITG concept, business/IT alignment, and corporate performance. These findings reveal that the agile ITG mechanisms within an effective ITG framework seem critical in today’s digital age.
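The mediated relationship (ITG → business/IT alignment → corporate performance) can be illustrated with a toy product-of-coefficients sketch using simple bivariate regressions. This is not the PLS-SEM procedure used in the thesis, and all data below are synthetic:

```python
def slope(x, y):
    """OLS slope of y regressed on x (bivariate case)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    return (sum((a - mx) * (b - my) for a, b in zip(x, y))
            / sum((a - mx) ** 2 for a in x))

# Synthetic scores: extended-ITG maturity, business/IT alignment, performance
itg   = [1, 2, 3, 4, 5, 6, 7, 8]
align = [0.9, 1.7, 2.6, 3.1, 4.2, 4.8, 5.5, 6.4]
perf  = [1.0, 1.5, 2.4, 3.0, 3.9, 4.6, 5.2, 6.1]

a = slope(itg, align)     # path ITG -> alignment
b = slope(align, perf)    # path alignment -> performance
indirect_effect = a * b   # product-of-coefficients mediation estimate
```

A positive indirect effect is the pattern the thesis reports; PLS-SEM additionally estimates measurement models and bootstraps significance, which this sketch omits.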
This research is unique in exploring the combination of traditional and agile ITG mechanisms. It contributes to the theoretical base by integrating and extending the literature on ITG, business/IT alignment, ambidexterity and agility, all of which have long been recognized as critical for achieving organizational goals. In summary, this work presents an original analysis of an effective ITG framework for digital transformation by including the agile aspect within the ITG construct. It highlights that it is not enough to apply only traditional mechanisms to achieve effective business/IT alignment in today’s digital age; agile ITG mechanisms are also needed. Therefore, a novel ITG framework following an ambidextrous approach is provided, consisting of traditional ITG mechanisms as well as newly developed agile ITG practices. This thesis also demonstrates that agile ITG mechanisms can be measured independently of traditional ITG mechanisms within one causal model. This is an important theoretical outcome that allows the current state of ITG to be assessed in two distinct dimensions, offering various pathways for further research on the different antecedents and effects of traditional and agile ITG mechanisms. Furthermore, this thesis makes practical contributions by highlighting the need to develop a basic governance framework powered by traditional ITG mechanisms and simultaneously increase agility in ITG mechanisms. The results imply that corporations might be even more successful if they include both traditional and agile mechanisms in their ITG framework. In this way, the uncovered agile ITG practices may provide a template for CIOs to derive their own mechanisms in following an ambidextrous approach that is suitable for their corporation.
The development of new materials that mimic cartilage and its function is an unmet need that would allow replacing only the damaged parts of joints instead of the whole joint. Polyvinyl alcohol (PVA) hydrogels have raised special interest for this application due to their biocompatibility, high swelling capacity and chemical stability. In this work, the effect of post-processing treatments (annealing, high hydrostatic pressure (HHP) and gamma-radiation) on the performance of PVA gels obtained by cast-drying was investigated, and their ability to serve as delivery vehicles for the anti-inflammatories diclofenac and ketorolac was evaluated. HHP damaged the hydrogels, breaking some bonds in the polymeric matrix, and therefore led to poor mechanical and tribological properties. The remaining treatments, in general, improved the performance of the materials by increasing their crystallinity. Annealing at 150 °C generated the best mechanical and tribological results: higher resistance to compressive and tensile loads, lower friction coefficients and the ability to support higher loads in sliding movement. This material was loaded with the anti-inflammatories, both without and with vitamin E (Vit.E) or Vit.E + cetalkonium chloride (CKC). Vit.E + CKC helped to control the release of the drugs, which occurred over 24 h. The material did not induce irritability or cytotoxicity and therefore shows high potential to be used in cartilage replacement with a therapeutic effect in the immediate postoperative period.
Purpose. To improve the efficiency of the closed-cycle operation of the field-oriented induction machine in dynamic behavior when load conditions are changing, considering the nonlinearities of the main inductance.
Methodology. The optimal control problem is defined as the minimization of the time integral of the energy losses. The algorithm presented in this paper is implemented using Matlab/Simulink, the dSPACE real-time interface, and the C language. Real-time applications are handled in the ControlDesk experiment software for seamless ECU development.
Findings. A discrete-time model with an integrated predictive control scheme, in which the optimization is performed online at every sampling step, has been developed. The optimal field-producing current trajectory is determined so that the copper losses are minimized over a wide operational range. Additionally, a comparison of measurement results with conventional methods is provided, which validates the advantages and performance of the control scheme.
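One sampling step of such an online optimisation can be sketched as a search for the field-producing current that minimises copper losses while delivering a required torque. The model below is deliberately simplified (constant parameters, torque proportional to the product of the current components, saturation of the main inductance ignored) and is not the paper's actual algorithm:

```python
def optimal_field_current(t_ref, r_s=1.0, c=1.0):
    """One sampling step: pick the field-producing current i_d that minimises
    copper losses r_s * (i_d^2 + i_q^2) subject to the simplified torque
    relation t_ref = c * i_d * i_q. A grid search stands in for the paper's
    online predictive optimisation."""
    best_id, best_loss = None, float("inf")
    for k in range(1, 500):
        i_d = 0.01 * k                 # candidate field-producing current
        i_q = t_ref / (c * i_d)        # torque-producing current required
        loss = r_s * (i_d ** 2 + i_q ** 2)
        if loss < best_loss:
            best_id, best_loss = i_d, loss
    # analytically, the optimum of this simplified model is i_d = sqrt(t_ref / c)
    return best_id, best_loss
```

Repeating this at every sampling step, with the torque reference updated from the load, yields a flux-current trajectory in the spirit of the scheme described above.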
Originality. To solve the given problem, the information vector on the current state of the coordinates of the electromechanical system is used to form a controlling influence in the dynamic mode of operation. For the first time, the formation process of controls has considered the current state and the desired future state of the system in the real-time domain.
Practical value. A predictive iterative approach to the optimal flux level of an induction machine is important to generate the required electromagnetic torque and to reduce power losses simultaneously.
The planning and control of intralogistics systems in line with the versatile production systems of smart factories require new approaches and methods to cope with changing requirements within future factories. The planning of intralogistics can no longer follow a static, sequential approach as in the past, since the planning assumptions are going to change at a high frequency. Reasons for these constant changes are, among others, external turbulences such as rapidly changing market conditions and decreasing batch sizes down to customer-specific products with a batch size of one, and, on the other hand, internal turbulences (such as breakdowns of production and logistics resources) affecting the production system. This paper gives an insight into research approaches and results on how the capabilities of intelligent logistical objects (intelligent bins, autonomous transport systems, etc.) can be used to achieve a self-organized, cost- and performance-optimized intralogistics system with autonomously controlled process execution within versatile production environments. A first consistent method has been developed, validated and implemented within a scenario at the pilot factory Werk150 at ESB Business School (Reutlingen University). Based on the incoming production orders, the method of the Extended Profitability Appraisal (EPA), covering the work system value to define the most effective work system for order fulfilment, is applied. To derive the appropriate intralogistics processes, an autonomous control method involving principles of decentralized and target-oriented decision-making (e.g. intelligent bins interacting with autonomously controlled transport systems to fulfil material orders of assembly workstations) has been developed and applied to achieve a target-optimized process execution.
The results of the first research stage, using predefined material sources and sinks as described in this paper, set the basis for the further development of a self-organized and autonomously controlled method for intralogistics systems considering dynamic source and sink relations. By allowing dynamic shifts of production orders in the sense of dynamic source-sink relations, the cost and performance aims of the intralogistics system can be directly aligned with the aims of the entire versatile production system in the sense of self-organized and autonomously controlled systems.
The approach of self-organized and autonomously controlled systems offers great potential to meet new requirements for the economical production of customized products with small batch sizes, based on a distributed, flexible management of dynamics and complexity within the production and intralogistics system. To support the practical application of self-organization in intralogistics systems, a catalogue of criteria for evaluating the self-organization of flexible logistics systems has been developed and validated, which enables the classification of logistics systems as well as the identification and evaluation of the potentials that can be achieved by increasing the degree of self-organization.
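The decentralized, target-oriented decision-making described above (intelligent bins interacting with transport systems) can be sketched as a simple bidding scheme: a bin announces a transport order and the cheapest-bidding transport agent wins. Agent names and the cost model are hypothetical, not taken from the Werk150 implementation:

```python
import math

class TransportAgent:
    """Autonomously controlled transport system that bids on transport orders."""
    def __init__(self, name, position, cost_per_metre):
        self.name = name
        self.position = position
        self.cost_per_metre = cost_per_metre

    def bid(self, pickup):
        # simple local cost model: empty-travel distance times a cost rate
        return math.dist(self.position, pickup) * self.cost_per_metre

def allocate_order(pickup, agents):
    """An intelligent bin announces its pickup point; the lowest bid wins."""
    return min(agents, key=lambda agent: agent.bid(pickup))

agents = [TransportAgent("AGV-1", (0.0, 0.0), 1.0),
          TransportAgent("AGV-2", (10.0, 0.0), 1.0)]
winner = allocate_order((2.0, 0.0), agents)   # AGV-1 is much closer
```

Each agent decides on the basis of purely local information, which is the essence of the decentralized control principle described above.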
Rapidly changing market conditions and global competition are leading to an increasing complexity of logistics systems and require innovative approaches with respect to the organisation and control of these systems. In scientific research, concepts of autonomously controlled logistics systems show a promising approach to meet the increasing requirements for flexible and efficient order processing. In this context, this work aims to introduce a system that is able to adjust order processing dynamically, and optimise intralogistics transportation regarding various generic intralogistics target criteria. The logistics system under consideration consists of various means of transport for autonomous decision-making and fulfilment of transport orders with defined source-sink relationships. The context of this work is set by introducing the Learning Factory Werk 150 with its existing hardware and software infrastructure and its defined target figures to measure the performance of the system. Specifically, the important target figures cost and performance are considered for the transportation system. The core idea of the system’s logic is to solve the problem of order allocation to specific means of transport by linking a Genetic Algorithm with a Multi-Agent System. The implementation of the developed system is described in an application scenario at the learning factory.
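The core idea of solving order allocation with a Genetic Algorithm can be sketched as follows: the chromosome assigns a means of transport to each order, and fitness is the total transport cost. All parameters and the cost matrix are illustrative, not those of the Werk 150 system:

```python
import random

def ga_allocate(cost, n_orders, n_vehicles, pop_size=30, gens=60, seed=1):
    """cost[o][v] is the cost of serving order o with vehicle v.
    Returns the cheapest found assignment (one vehicle index per order)."""
    rng = random.Random(seed)

    def fitness(ch):
        return sum(cost[o][v] for o, v in enumerate(ch))

    # random initial population of assignments
    pop = [[rng.randrange(n_vehicles) for _ in range(n_orders)]
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness)
        elite = pop[: pop_size // 2]          # keep the cheaper half
        children = []
        while len(children) < pop_size - len(elite):
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, n_orders)  # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.2:            # mutation: reassign one order
                child[rng.randrange(n_orders)] = rng.randrange(n_vehicles)
            children.append(child)
        pop = elite + children
    return min(pop, key=fitness)
```

In the paper's architecture, the cost entries would come from the agents themselves (a Multi-Agent System reporting, e.g., distances and utilisation), while the GA searches the assignment space.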
Learning factories can complement each other by training different competencies in the field of digitalisation and Industry 4.0. They depict diverse sections of the product development process and focus on various technologies. Within the framework of the International Association of Learning Factories (IALF), the operating organisations of learning factories exchange information on research, training and education. One of the aims is to develop joint projects. The article presents different concepts of cooperation between learning factories while focusing on improving the development of learners’ competencies, e.g. with a broader range of topics. A concept of a joint course between the learning factories in Bochum, Reutlingen and Darmstadt is explained in detail. The three learning factories will be examined with regard to their similarities and differences. The joint course focuses on the target group of students and the topic of digitalisation in the development and production of products. The course and its contents are explained in detail. The new learning approach is evaluated on the basis of feedback from the participants. Finally, challenges resulting from the cooperation between learning factories at different locations and with different operating models will be discussed.
This paper presents a novel emulation concept for testing smart contracts and Distributed Ledger Technologies (DLT) in distributed control or energy-economy tasks and use cases. The concept uses state-of-the-art behavioral modeling tools such as Matlab Simulink but presents a possible way to overcome Simulink’s inability to communicate with DLT nodes directly; this is solved through a middleware solution. After this, an example used in verifying the test bed is presented and the target demonstration object is described. Finally, the possible expansion of the system is discussed.
This paper presents a solution that enables end customers of the energy system to participate in new local micro-energy markets by providing them with a distributed, decentralized, transparent and secure Peer-to-Peer (P2P) payment system, which functions automatically by applying new concepts of Machine-to-Machine (M2M) communication technologies. This work was performed within the German project VK_2G, funded by the DBU. The key results were: providing means to perform microtransactions in a P2P fashion between end consumers and prosumers in local communities at low cost in a transparent and secure manner; developing a platform with pre-defined smart contracts that can easily be tailored to different end customers’ needs; and integrating both the market platform and the local control of generation and loads. This solution has been developed, integrated and tested in a laboratory prototype. This paper discusses the solution and presents the results of the first test.
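The behaviour of such a pre-defined settlement contract can be illustrated with a toy ledger that tracks balances and keeps a transparent transaction log. This is a plain-Python stand-in for illustration, not the project's DLT platform:

```python
class MicroPaymentContract:
    """Toy stand-in for a smart contract settling P2P energy microtransactions
    between a prosumer (seller) and a consumer (buyer). Hypothetical."""
    def __init__(self, balances):
        self.balances = dict(balances)   # account -> credit balance
        self.log = []                    # transparent, append-only audit trail

    def settle(self, seller, buyer, kwh, price_per_kwh):
        amount = kwh * price_per_kwh
        if self.balances[buyer] < amount:
            raise ValueError("insufficient balance")
        self.balances[buyer] -= amount
        self.balances[seller] += amount
        self.log.append((seller, buyer, kwh, amount))
        return amount

ledger = MicroPaymentContract({"prosumer_a": 0.0, "consumer_b": 10.0})
paid = ledger.settle("prosumer_a", "consumer_b", kwh=5.0, price_per_kwh=0.30)
```

In the actual system, the M2M layer would trigger `settle` automatically from metered energy flows, and the log would live on the distributed ledger rather than in process memory.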
Facial expressions play a dominant role in facilitating social interactions. We endeavor to develop tactile displays to reinstate facial-expression-modulated communication. The high spatial and temporal dimensionality of facial movements poses a unique challenge when designing tactile encodings of them. A further challenge is developing encodings that are attuned to the perceptual characteristics of our skin. A caveat of using vibrotactile displays is that tactile stimuli have been shown to induce perceptual tactile aftereffects when used on the fingers, arm and face. However, at present, despite the prevalence of waist-worn tactile displays, no such investigations of tactile aftereffects at the waist region exist in the literature, though they are warranted by the unique sensory and perceptual signalling characteristics of this area. Using an adaptation paradigm, we investigated the presence of perceptual tactile aftereffects induced by continuous and burst vibrotactile stimuli delivered at the navel, side and spinal regions of the waist. We report evidence that the tactile perception topology of the waist is non-uniform: specifically, the navel and spine regions are resistant to adaptive aftereffects, while side regions are more prone to perceptual adaptations to continuous but not burst stimulations. Results of our current investigations highlight the unique set of challenges posed by designing waist-worn tactile displays. These and future perceptual studies can directly inform more realistic and effective implementations of complex high-dimensional spatiotemporal social cues.
Deutschland, quo vadis?
(2020)
Shutdown in Germany in March 2020. Standstill in trade and industry. The market value of a considerable number of companies halved within a very short time. Investors dumped everything onto the market, and amid the high uncertainty all asset classes lost value, at times even gold. Even corporations such as Lufthansa will no longer manage to survive without state aid.
In the GalvanoFlex_BW project, various options for improving the energy efficiency of energy-intensive industrial companies were identified and investigated. The introduction of combined heat and power (CHP) in electricity-optimised operation was a particular focus. In addition to the technical investigation, a social-science analysis was carried out in order to consider the introduction and implementation of corresponding measures from this perspective as well. A further important focus of the project was the transfer of the knowledge gained to other companies, institutions, etc. that were not directly involved in the project.
Through the collaboration of four research partners and three industry partners, the work was carried out in a practice-oriented manner on the basis of real measurement data as well as through interviews with the people involved at the project partners, in the sense of a real-world laboratory.
The need to save energy, and thus to implement efficiency measures, is an important step towards achieving the reduction of greenhouse gases planned by the EU. Moreover, energy costs are expected to continue to rise in the future. Measures to increase energy efficiency are therefore becoming an important element in ensuring competitive production. Within the project, various energy-efficiency measures specifically for electroplating were researched, analysed and compiled into a catalogue of measures. Furthermore, an evaluation method was developed to support electroplating companies in identifying worthwhile energy-efficiency measures.
In the investigations into the implementation of combined heat and power plants, the example of two companies with very different electricity and heat demands showed that the use of such plants is economically worthwhile. In the most favourable case, payback periods of about two years result. It also became apparent that the dimensioning of the CHP unit depends strongly on the electricity and heat demand, and that the buffer storage should by no means be undersized. An intelligent electricity-optimised control with peak-load management, however, can often improve the economic viability only slightly compared with heat-led operation. The reason is that the current subsidy and remuneration system for CHP plants offers almost no incentives for operation oriented towards electricity demand and thus towards covering residual load. Only for companies with pronounced peaks in their electricity consumption can the targeted use of a CHP unit lead to a significant reduction in the demand charge.
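The payback estimate behind such an assessment can be sketched as a static calculation: annual savings are the value of self-generated electricity and heat minus fuel and upkeep, and the payback period is the investment divided by those savings. All figures below are hypothetical, not the project's measured data:

```python
def chp_payback_years(invest, el_kwh, heat_kwh, price_el, price_heat,
                      fuel_cost, upkeep_cost):
    """Static payback period of a CHP unit. All inputs are annual values."""
    annual_savings = (el_kwh * price_el          # avoided grid electricity
                      + heat_kwh * price_heat    # avoided heat generation
                      - fuel_cost                # gas for the CHP unit
                      - upkeep_cost)             # maintenance
    return invest / annual_savings

# Hypothetical example roughly matching the reported two-year best case
years = chp_payback_years(invest=100_000,
                          el_kwh=200_000, heat_kwh=300_000,
                          price_el=0.25, price_heat=0.05,
                          fuel_cost=12_000, upkeep_cost=3_000)
```

A real assessment would use load profiles and the CHP subsidy scheme rather than flat annual figures, which is precisely why the dimensioning was found to depend so strongly on the demand profiles.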
Beyond that, the introduction of a CHP plant is generally to be regarded as a complex energy-efficiency measure which, in addition to the economic aspects, places increased demands on the companies and their professional environment; these demands were examined in the accompanying social-science research. Drivers as well as barriers to implementing the technology were identified both inside and outside the companies. The internal barriers can be traced back to different causes, such as the high complexity of CHP technology, the difficulty of assessing the overall benefit within the company, insufficient staffing and a lack of entrepreneurial decisions. An improved advisory service, particularly from neutral bodies, as well as plant operation under contracting models can remedy this and thus act as drivers.
The transfer of the project results already took place during the project period via the industry platform, in the course of various workshops aimed specifically at companies and institutions outside the project consortium. To amplify the dissemination of the knowledge gained, a series of four technical articles was published in a well-known industry magazine at the end of the project, and a project website was created (www.galvanoflex_bw.de). The latter serves not only to disseminate knowledge but is also intended to act as a contact platform beyond the end of the project, supporting the implementation of efficiency measures with the knowledge generated in the project.
Automatic anode rod inspection in aluminum smelters using deep-learning techniques: a case study
(2020)
Automatic fault detection using machine learning has become an exciting and promising area of research, because it provides an accurate and timely way to manage and classify faults with minimal human effort. In the computer vision community, deep-learning methods have become the most suitable approaches for this task. Anodes are large carbon blocks that are used to conduct electricity during the aluminum reduction process. The most basic function of anode rod inspection is to prevent a situation where the anode rod will not fit into the stub-holes of a new anode, as would be the case for a rod with severe toe-in, missing stubs, or a thimble retained on one or more stubs. In this work, to improve the accuracy of shape-defect inspection for anode rods, we use the Fast Region-based Convolutional Network (Fast R-CNN) model. To train the detection model, we collected an image dataset composed of multiple classes of anode rod defects with annotated labels. Our model is trained using a small number of samples, an essential requirement in industry, where the number of available defective samples is limited. It can simultaneously detect multiple classes of anode rod defects in near real-time.
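Evaluating such a multi-class defect detector typically reduces to matching predicted boxes against ground truth by label and intersection-over-union (IoU). A minimal sketch of that generic evaluation logic (boxes and defect labels below are made up; this is not the paper's code):

```python
def iou(a, b):
    """Intersection over union of two boxes given as (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def match_detections(preds, gts, thr=0.5):
    """preds/gts: lists of (label, box). A prediction is a hit if an unused
    same-label ground-truth box overlaps it with IoU >= thr."""
    hits, used = 0, set()
    for label, box in preds:
        for i, (g_label, g_box) in enumerate(gts):
            if i in used or g_label != label:
                continue
            if iou(box, g_box) >= thr:
                hits += 1
                used.add(i)
                break
    return hits
```

From such per-image hit counts, precision and recall per defect class follow directly, which is the usual way to report detector accuracy.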
Sol-gel-based flame retardants represent a promising approach for textiles, especially as replacements for the currently established halogen-containing flame retardants. The latter have come under criticism owing to toxicological concerns and their sometimes bioaccumulative properties. This research project therefore investigated how halogen-free flame retardants can be realised from phosphorus- and nitrogen-containing silanes. The sol-gel layer acted on the one hand as a non-combustible binder; on the other hand, flame-retardant active groups could be incorporated directly by attaching phosphorus groups to commercially available silanes. Various synthesis routes were pursued, and all of the N-P silanes produced achieved flame protection according to DIN EN ISO 15025 (protective clothing – protection against heat and flame). The flame-retardant effect depends strongly on the functional groups and the oxidation state of the phosphorus; adequate flame protection was achieved at add-ons of 5 %. It could be shown that a mechanism based on the formation of a protective layer is mainly responsible for the flame retardancy, a result that should not be underestimated for the future optimisation of such finishes. Finishing trials on a semi-industrial scale further showed that, in principle, nothing stands in the way of large-scale implementation of the applied finishes. Depending on the functional group on the phosphorus, the water solubility and the washing stability could be controlled: hydrophobic N-P silanes exhibit better washing resistance, whereas hydrophilic N-P silanes only attain it at fixation temperatures of 180 °C.
Building on these results, nitrogen-generating and cyanuric-acid-based N-P silanes were developed, which are distinguished in particular by a good flame-retardant effect on blended fabrics. Overall, the research project demonstrated that N-P silanes are excellent permanent flame retardants for textiles and clarified the mechanism on which this flame protection is based.
With the continuous development of the economy, consumers pay more attention to personalized clothing. However, the recommendation quality of existing clothing recommendation systems is not sufficient to meet users’ needs. When browsing clothing online, facial expression is salient information for understanding the user’s preference. In this paper, we propose a novel method to automatically personalize clothing recommendations based on analysis of the user’s emotions. First, the facial expression is classified by a multiclass SVM. Next, the user’s multi-interest value is calculated using the expression intensity obtained by a hybrid RCNN. Finally, the multi-interest value is fused to carry out personalized recommendation. The experimental results show that the proposed method achieves a significant improvement over other algorithms.
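The fusion step can be sketched as weighting expression-class probabilities and scaling by the expression intensity to obtain a per-item interest value used for ranking. Class weights and the browsing signals below are hypothetical, not the paper's learned values:

```python
def interest_value(expr_probs, intensity, weights):
    """Fuse expression-class probabilities (e.g. from a multiclass SVM) with
    an intensity estimate into a single interest score."""
    return intensity * sum(p * weights[label] for label, p in expr_probs.items())

def recommend(per_item_signals, weights, top_k=2):
    """Rank items by the fused interest value observed while browsing them."""
    scores = {item: interest_value(probs, inten, weights)
              for item, (probs, inten) in per_item_signals.items()}
    return sorted(scores, key=scores.get, reverse=True)[:top_k]

# hypothetical class weights: positive expressions signal interest
weights = {"happy": 1.0, "neutral": 0.2, "sad": -0.5}
signals = {
    "dress": ({"happy": 0.9, "neutral": 0.1}, 0.8),   # strong positive reaction
    "coat":  ({"happy": 0.2, "neutral": 0.8}, 0.5),   # lukewarm reaction
    "hat":   ({"sad": 1.0}, 0.9),                     # clearly negative reaction
}
top = recommend(signals, weights)
```

In the paper, the class probabilities come from the SVM and the intensity from the hybrid RCNN; this sketch only illustrates how the two signals could be combined into a ranking.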
”I have never seen one who loves virtue as much as he loves beauty,” Confucius once said. If beauty is as important as goodness, it becomes clear why people invest so much effort in their first impression. The aesthetics of faces has many aspects, and there is a strong correlation with all characteristics of humans, such as age and gender. Often, research on aesthetics by social and ethics scientists lacks sufficient labelled data and the support of machine vision tools. In this position paper we propose the Aesthetic-Faces dataset, containing training data labelled by Chinese and German annotators. As a combination of three image subsets, the AF-dataset consists of European, Asian and African people. The research communities in machine learning, aesthetics and social ethics can benefit from our dataset and our toolbox. The toolbox provides many functions for machine learning with state-of-the-art CNNs and an Extreme-Gradient-Boosting regressor, but also 3D Morphable Model technologies for face shape evaluation, and we discuss how to train an aesthetic estimator considering culture and ethics.
3D-assisted 2D face recognition involves reconstructing 3D faces from 2D images and solving the face recognition problem in 3D. To facilitate the use of deep neural networks, a 3D face, normally represented as a 3D mesh of vertices and its corresponding surface texture, is remapped to image-like square isomaps by a conformal mapping. Based on previous work, we assume that face recognition benefits more from texture. In this work, we focus on the surface texture and its discriminatory information content for recognition purposes. Our approach is to prepare a 3D mesh, the corresponding surface texture and the original 2D image as triple input for the recognition network, to show that 3D data is useful for face recognition. Texture enhancement methods to control the texture fusion process are introduced, and we adapt data augmentation methods. Our results show that texture-map-based face recognition can not only compete with state-of-the-art systems under the same preconditions but also outperforms standard 2D methods from recent years.
It has been widely shown that biomaterial surface topography can modulate host immune response, but a fundamental understanding of how different topographies contribute to pro-inflammatory or anti-inflammatory responses is still lacking. To investigate the impact of surface topography on immune response, we undertook a systematic approach by analyzing immune response to eight grades of medical grade polyurethane of increasing surface roughness in three in vitro models of the human immune system. Polyurethane specimens were produced with defined roughness values by injection molding according to the VDI 3400 industrial standard. Specimens ranged from 0.1 μm to 18 μm in average roughness (Ra), which was confirmed by confocal scanning microscopy. Immunological responses were assessed with THP-1-derived macrophages, human peripheral blood mononuclear cells (PBMCs), and whole blood following culture on polyurethane specimens. As shown by the release of pro-inflammatory and anti-inflammatory cytokines in all three models, a mild immune response to polyurethane was observed; however, this response was not associated with the degree of surface roughness. Likewise, the cell morphology (cell spreading, circularity, and elongation) in THP-1-derived macrophages and the expression of CD molecules in the PBMC model on T cells (HLA-DR and CD16), NK cells (HLA-DR), and monocytes (HLA-DR, CD16, CD86, and CD163) showed no influence of surface roughness. In summary, this study shows that modifying surface roughness in the micrometer range on polyurethane has no impact on the pro-inflammatory immune response. Therefore, we propose that such modifications do not affect the immunocompatibility of polyurethane, thereby supporting the notion of polyurethane as a biocompatible material.
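The roughness grades above are reported as average roughness Ra, the arithmetic mean absolute deviation of the height profile from its mean line. A minimal sketch of this calculation (the profile values below are invented for illustration, not data from the study):

```python
import numpy as np

def average_roughness(profile_um):
    """Arithmetic average roughness Ra: mean absolute deviation
    of the sampled height profile from its mean line."""
    z = np.asarray(profile_um, dtype=float)
    return float(np.mean(np.abs(z - z.mean())))

# Hypothetical height profile (micrometres), e.g. sampled along a
# line scan from confocal microscopy
profile = [0.2, -0.1, 0.4, -0.3, 0.1, -0.2]
ra = average_roughness(profile)
```

On real confocal data the same formula is applied per scan line after levelling, which the sketch omits.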
Appropriate mechanical properties and fast endothelialization of synthetic grafts are key to ensure long-term functionality of implants. We used a newly developed biostable polyurethane elastomer (TPCU) to engineer electrospun vascular scaffolds with promising mechanical properties (E-modulus: 4.8 ± 0.6 MPa, burst pressure: 3326 ± 78 mmHg), which were biofunctionalized with fibronectin (FN) and decorin (DCN). Neither uncoated nor biofunctionalized TPCU scaffolds induced major adverse immune responses except for minor signs of polymorph nuclear cell activation. The in vivo endothelial progenitor cell homing potential of the biofunctionalized scaffolds was simulated in vitro by attracting endothelial colony-forming cells (ECFCs). Although DCN coating did attract ECFCs in combination with FN (FN + DCN), DCN-coated TPCU scaffolds showed a cell-repellent effect in the absence of FN. In a tissue-engineering approach, the electrospun and biofunctionalized tubular grafts were cultured with primary-isolated vascular endothelial cells in a custom-made bioreactor under dynamic conditions with the aim to engineer an advanced therapy medicinal product. Both FN and FN + DCN functionalization supported the formation of a confluent and functional endothelial layer.
Development work within an experimental environment, in which certain properties are investigated and optimized, requires many test runs and is therefore often associated with long execution times, costs and risks. This can affect product, material and technology development in industry and research. New digital technologies offer the possibility to automate complex manual work steps in a cost-effective way, to increase the relevance of the results and to accelerate the processes many times over. In this context, this article presents a low-cost, modular and open-source machine vision system for test execution and evaluates it on the basis of a real industrial application. For this purpose, a methodology is presented for the automated execution of the load intervals, for process documentation, and for the evaluation of the generated data by means of machine learning to classify wear levels. The software and the mechanical structure are designed to be adaptable to different conditions and components and to a variety of tasks in industry and research. The mechanical structure is required for tracking the test object and represents a motion platform positioned independently via machine vision operators or machine learning. The state of the test object is evaluated by transfer learning after the initial documentation run. The manual procedure for classifying the visually recorded data on the state of the test object is described for the training material. This leads to increased resource efficiency on the material as well as on the personnel side, since on the one hand the significance of the tests performed is increased by the continuous documentation, and on the other hand the responsible experts can be assigned time-efficiently. The presence and know-how of the experts are therefore only required for defined and decisive events during the execution of the experiments.
Furthermore, the generated data are suitable for later use as an additional source of data for predictive maintenance of the developed object.
Context: A product roadmap is an important tool in product development. It sets the strategic direction in which the product is to be developed to achieve the company’s vision. However, for product roadmaps to be successful, it is essential that all stakeholders agree with the company’s vision and objectives and are aligned and committed to a common product plan.
Objective: In order to gain a better understanding of product roadmap alignment, this paper aims at identifying measures, activities and techniques in order to align the different stakeholders around the product roadmap.
Method: We conducted a grey literature review according to the guidelines of Garousi et al.
Results: Several approaches to gain alignment were identified such as defining and communicating clear objectives based on the product vision, conducting cross-functional workshops, shuttle diplomacy, and mission briefing. In addition, our review identified the “Behavioural Change Stairway Model” that suggests five steps to gain alignment by building empathy and a trustful relationship.
Context: Organizations are increasingly challenged by dynamic and technical market environments. Traditional product roadmapping practices such as detailed and fixed long-term planning typically fail in such environments. Therefore, companies are actively seeking ways to improve their product roadmapping approach.
Goal: This paper aims at identifying problems and challenges with respect to product roadmapping. In addition, it aims at understanding how companies succeed in improving their roadmapping practices in their respective company contexts.
Method: We conducted semi-structured expert interviews with 15 experts from 13 German companies and performed a thematic data analysis.
Results: The analysis showed that a significant number of companies are still struggling with traditional feature-based product roadmapping and opinion-based prioritization of features. The most promising areas for improvement are stating the outcomes a company is trying to achieve and making them part of the roadmap, sharing or co-developing the roadmap with stakeholders, and establishing discovery activities.
The emergence of agile methods and practices has not only changed the development processes but might also have affected how companies conduct software process improvement (SPI). Through a set of complementary studies, we aim to understand how SPI has changed in times of agile software development. Specifically, we aim (1) to identify and characterize the set of publications that connect elements of agility to SPI, (2) to explore to which extent agile methods/practices have been used in the context of SPI, and (3) to understand whether the topics addressed in the literature are relevant and useful for industry professionals. To study these questions, we conducted an in-depth analysis of the literature identified in a previous mapping study, an interview study, and an analysis of the responses given by industry professionals to SPI-related questions stemming from an independently conducted survey study.
Background
A central task of electrocardiographic examinations is to increase the reliability of diagnosing the condition of the heart. Within this task, an important direction is the solution of the inverse problem of electrocardiography, based on processing the electrocardiographic signals of multichannel cardio leads at known electrode coordinates in these leads (Titomir et al., Noninvasive electrocardiotopography, 2003), (Macfarlane et al., Comprehensive Electrocardiology, 2nd ed. (Chapter 9), 2011).
Results
In order to obtain more detailed information about the electrical activity of the heart, we reconstruct the distribution of equivalent electrical sources on the heart surface. We perform this reconstruction of the equivalent sources over the cardiac cycle at relatively low hardware cost. ECG maps of electrical potentials on the surface of the torso (TSPM) and of electrical sources on the surface of the heart (HSSM) were studied for different times of the cardiac cycle. We carried out a visual and quantitative comparison of these maps in the presence of pathological regions of different localization. For this purpose we used a model of the electrical activity of the heart based on cellular automata.
Conclusions
The cellular automata model allows us to consider the processes of heart excitation in the presence of pathological regions of various sizes and localizations. It is shown that changes in the distribution of electrical sources on the surface of the epicardium in the presence of pathological areas with disturbed conduction of heart excitation are much more noticeable than changes in ECG maps on the torso surface.
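The idea of modelling excitation spread with cellular automata can be illustrated with a Greenberg-Hastings-style excitable medium: resting cells are excited by excited neighbours, then pass through a refractory period, and a blocked mask stands in for a non-conducting pathological region. This is a minimal illustrative sketch under assumed parameters (grid size, von Neumann neighbourhood, refractory length), not the authors' actual heart model:

```python
import numpy as np

RESTING, EXCITED = 0, 1
REFRACTORY_STEPS = 3  # states 2..4 count down the refractory period

def step(grid, blocked):
    """One synchronous update of the excitable-medium automaton.
    `blocked` is a boolean mask marking a pathological region
    that never conducts excitation."""
    new = grid.copy()
    excited = (grid == EXCITED)
    # collect excited 4-neighbours (von Neumann neighbourhood)
    nb = np.zeros_like(grid, dtype=bool)
    nb[1:, :] |= excited[:-1, :]
    nb[:-1, :] |= excited[1:, :]
    nb[:, 1:] |= excited[:, :-1]
    nb[:, :-1] |= excited[:, 1:]
    new[(grid == RESTING) & nb & ~blocked] = EXCITED  # excitation spreads
    new[grid == EXCITED] = 2                          # enter refractory phase
    refr = (grid >= 2) & (grid < 1 + REFRACTORY_STEPS)
    new[refr] = grid[refr] + 1                        # advance refractory counter
    new[grid == 1 + REFRACTORY_STEPS] = RESTING       # recover
    return new

# A wave started in one corner spreads around, but not through,
# the blocked (pathological) patch.
grid = np.zeros((8, 8), dtype=int)
grid[0, 0] = EXCITED
blocked = np.zeros((8, 8), dtype=bool)
blocked[3:5, 3:5] = True
reached = False
for _ in range(20):
    grid = step(grid, blocked)
    reached = reached or grid[7, 7] == EXCITED
```

After the wavefront has passed, all cells return to rest; with a larger obstacle and a suitably short refractory period, the same rules can also produce re-entrant waves.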
Regardless of company size or industry sector, a majority of project teams and companies use customized processes that combine different development methods, so-called hybrid development methods. Even though such hybrid development methods are highly individualized, a common understanding of how to systematically construct synergetic practices is missing. Based on 1,467 data points from a large-scale online survey among practitioners, we study the current state of practice in process use to answer the question: What are hybrid development methods made of? Our findings reveal that only eight methods and few practices build the core of modern software development. This small set allows for statistically constructing hybrid development methods.
Digital technologies are main strategic drivers for digitalization and offer ubiquitous data availability, unlimited connectivity, and massive processing power for a fundamentally changing business. This leads to the development and application of intelligent digital systems. The current state of research and practice of architecting digital systems and services lacks a solid methodological foundation that fully accommodates all requirements linked to efficient and effective development of digital systems in organizations. The research presented in this paper addresses the question of how the management of complexity in digital systems and architectures can be supported from a methodological perspective. In this context, the current focus is on a better understanding of the causes of increased complexity and the requirements for methodological support. For this purpose, we take an enterprise architecture perspective, i.e. we examine how the introduction of digital systems affects the complexity of the EA. Two industrial case studies and a systematic literature analysis result in the proposal of an extended Digital Enterprise Architecture Cube as a framework for future methodical support.
Artificial Intelligence-based Assistants (AIAs) are spreading quickly in both homes and offices. They have already left their original habitat of "intelligent speakers" providing easy access to music collections. They initiated a multitude of new devices and are already populating devices such as TV sets. Characteristic of intelligent digital assistants is the formation of platforms around their core functionality. Thus, the AI capabilities of the assistants are used to offer new services and create new interfaces for business processes. There are positive network effects between the assistants and the services as well as within the services. Therefore, many companies see the need to get involved in the field of digital assistants but lack a framework to align their initiatives with their corporate strategies. In order to lay the foundation for a comprehensive method, we are therefore investigating intelligent digital assistants. Based on this analysis, we are developing a framework of strategic opportunities and challenges.
Artificial Intelligence enables innovative applications, and applications based on Artificial Intelligence are increasingly important for all aspects of the Digital Economy. However, the question of how AI resources such as tools and data can be linked to provide an AI capability and create business value is still open. Therefore, this paper identifies the value-creating mechanisms of connectionist artificial intelligence using a capability-oriented view and points out the connections to different kinds of business value. The analysis supports an agenda that identifies areas that need further research to understand the mechanism of value creation in connectionist artificial intelligence.
Scenario-based analysis is a comprehensive technique to evaluate software quality and can provide more detailed insights than e.g. maintainability metrics. Since such methods typically require significant manual effort, we designed a lightweight scenario-based evolvability evaluation method. To increase efficiency and to limit assumptions, the method exclusively targets service- and microservice-based systems. Additionally, we implemented web-based tool support for each step. Method and tool were also evaluated with a survey (N=40) that focused on change effort estimation techniques and hands-on interviews (N=7) that focused on usability. Based on the evaluation results, we improved method and tool support further. To increase reuse and transparency, the web-based application as well as all survey and interview artifacts are publicly available on GitHub. In its current state, the tool-supported method is ready for first industry case studies.
Context: Fast moving markets and the age of digitization require that software can be quickly changed or extended with new features. The associated quality attribute is referred to as evolvability: the degree of effectiveness and efficiency with which a system can be adapted or extended. Evolvability is especially important for software with frequently changing requirements, e.g. internet-based systems. Several evolvability-related benefits were arguably gained with the rise of service-oriented computing (SOC) that established itself as one of the most important paradigms for distributed systems over the last decade. The implementation of enterprise-wide software landscapes in the style of service-oriented architecture (SOA) prioritizes loose coupling, encapsulation, interoperability, composition, and reuse. In recent years, microservices quickly gained in popularity as an agile, DevOps-focused, and decentralized service-oriented variant with fine-grained services. A key idea here is that small and loosely coupled services that are independently deployable should be easy to change and to replace. Moreover, one of the postulated microservices characteristics is evolutionary design.
Problem Statement: While these properties provide a favorable theoretical basis for evolvable systems, they offer no concrete and universally applicable solutions. As with each architectural style, the implementation of a concrete microservice-based system can be of arbitrary quality. Several studies also report that software professionals trust in the foundational maintainability of service orientation and microservices in particular. A blind belief in these qualities without appropriate evolvability assurance can lead to violations of important principles and therefore negatively impact software evolution. In addition to this, very little scientific research has covered the areas of maintenance, evolution, or technical debt of microservices.
Objectives: To address this, the aim of this research is to support developers of microservices with appropriate methods, techniques, and tools to evaluate or improve evolvability and to facilitate sustainable long-term development. In particular, we want to provide recommendations and tool support for metric-based as well as scenario-based evaluation. In the context of service-based evolvability, we furthermore want to analyze the effectiveness of patterns and collect relevant antipatterns.
Methods: Using empirical methods, we analyzed the industry state of the practice and the academic state of the art, which helped us to identify existing techniques, challenges, and research gaps. Based on these findings, we then designed new evolvability assurance techniques and used additional empirical studies to demonstrate and evaluate their effectiveness. Applied empirical methods were, for example, surveys, interviews, (systematic) literature studies, and controlled experiments.
Contributions: In addition to our analyses of industry practice and scientific literature, we provide contributions in three different areas. With respect to metric-based evolvability evaluation, we identified a set of structural metrics specifically designed for service orientation and analyzed their value for microservices. Subsequently, we designed tool-supported approaches to automatically gather a subset of these metrics from machine-readable RESTful API descriptions and via a distributed tracing mechanism at runtime. In the area of scenario-based evaluation, we developed a tool-supported lightweight method to analyze the evolvability of a service-based system based on hypothetical evolution scenarios. We evaluated the method with a survey (N=40) as well as hands-on interviews (N=7) and improved it further based on the findings. Lastly with respect to patterns and antipatterns, we collected a large set of service-based patterns and analyzed their applicability for microservices. From this initial catalogue, we synthesized a set of candidate evolvability patterns via the proxy of architectural modifiability tactics. The impact of four of these patterns on evolvability was then empirically tested in a controlled experiment (N=69) and with a metric-based analysis. The results suggest that the additional structural complexity introduced by the patterns as well as developers' pattern knowledge have an influence on their effectiveness. As a last contribution, we created a holistic collection of service-based antipatterns for both SOA and microservices and published it in a collaborative repository.
Conclusion: Our contributions provide first foundations for a holistic view on the evolvability assurance of microservices and address several perspectives. Metric- and scenario-based evaluation as well as service-based antipatterns can be used to identify "hot spots" while service-based patterns can remediate them and provide means for systematic evolvability construction. All in all, researchers and practitioners in the field of microservices can use our artifacts to analyze and improve the evolvability of their systems as well as to gain a conceptual understanding of service-based evolvability assurance.
Polyelectrolyte multilayer (PEM) coatings are prepared by alternating layer-by-layer deposition of cationic and anionic polyelectrolyte monolayers on charged surfaces. The thickness of the coatings ranges from a few nm to a few μm. Their properties such as roughness, stiffness, surface charge and surface energy can be precisely tuned to fulfil different technical or biological requirements. The coating process is based on self-assembly of polyelectrolytes. Advantages of these coatings are their easy handling, the absence of harsh chemistry and the possibility of coating complex geometries. PEM coatings can be prepared from a variety of suitable polyelectrolytes. Their stability varies from very durable PEM coatings that are only soluble in strong solvents to quickly degradable ones, which may be applied as drug release systems. One example of such a degradable PEM system is based on the polyelectrolyte pair hyaluronan (HA) and chitosan (CHI). These biopolymers originate from natural sources and show low toxicity towards human cells. However, HA/CHI multilayers show only weak adhesiveness for human umbilical vein endothelial cells (HUVEC). In this article, we summarize our approaches to enhance HA/CHI multilayers by incorporation of a non-polymer substance, graphene oxide, to improve cell adhesion while retaining properties such as low cytotoxicity and biodegradability. Different approaches for incorporation of graphene oxide were performed and the cellular adhesion was tested by a metabolic assay.
Introduction: Bioresorbable collagenous barrier membranes are used to prevent premature soft tissue ingrowth and to allow bone regeneration. For volume-stable indications, only non-absorbable synthetic materials are available. This study investigates a new bioresorbable hydrofluoric acid (HF)-treated magnesium (Mg) mesh in a native collagen membrane for volume-stable situations. Materials and Methods: HF-treated and untreated Mg were compared in direct and indirect cytocompatibility assays. In vivo, 18 New Zealand White rabbits each received four 8 mm calvarial defects and were divided into four groups: (a) HF-treated Mg mesh/collagen membrane, (b) untreated Mg mesh/collagen membrane, (c) collagen membrane and (d) sham operation. After 6, 12 and 18 weeks, Mg degradation and bone regeneration were measured using radiological and histological methods. Results: In vitro, HF-treated Mg showed higher cytocompatibility. Histopathologically, HF-Mg prevented gas cavities and was degraded by mononuclear cells via phagocytosis up to 12 weeks. Untreated Mg showed in part significantly more gas cavities and a fibrous tissue reaction. Bone regeneration was not significantly different between the groups. Discussion and Conclusions: HF-Mg meshes embedded in native collagen membranes represent a volume-stable and biocompatible alternative to non-absorbable synthetic materials. HF-Mg shows less corrosion and is degraded by phagocytosis. However, the application of the membranes did not result in higher bone regeneration.
Drug-induced liver toxicity is one of the most common reasons for the failure of drugs in clinical trials and frequent withdrawal from the market. Reasons for such failures include the low predictive power of in vivo studies, which is mainly caused by metabolic differences between humans and animals, and intraspecific variances. In addition to factors such as age and genetic background, changes in drug metabolism can also be caused by disease-related changes in the liver. Such metabolic changes have also been observed in clinical settings, for example, in association with a change in liver stiffness, a major characteristic of an altered fibrotic liver. To mimic these changes in an in vitro model, this study aimed to develop scaffolds that represent the rigidity of healthy and fibrotic liver tissue. We observed that liver cells plated on scaffolds representing the stiffness of healthy livers showed a higher metabolic activity compared to cells plated on stiffer scaffolds. Additionally, we detected a positive effect of pre-coating the scaffold with fetal calf serum (FCS)-containing media: this pre-incubation resulted in increased cell adherence during cell seeding onto the scaffolds. In summary, we developed a scaffold-based 3D model that mimics liver stiffness-dependent changes in drug metabolism and may more easily predict drug interaction in diseased livers.
Endogenous electrical fields play an important role in various physiological and pathological events. Yet the effects of electrical cues on processes such as wound healing, tumor development or metastasis are still rarely investigated, though it is known that direct current electrical fields can alter cell migration or proliferation in vitro. Several 2D experimental models for studying cell responses to direct current electrical fields have been presented and characterized, but suitable experimental models for electrotaxis studies in 3D are rare. Here we present a novel, easy-to-produce, multi-well-based galvanotactic chamber for use in 2D and 3D cell experiments to investigate the influence of electrical fields on tumor cell migration and tumor spheroid growth. The presented system allows the simultaneous application of an electrical field to cells in four chambers, either cultured on the bottom of the culture plate (2D) or embedded in hydrogel-filled channels (3D). The set-up is also suitable for live-cell imaging. Validation tests show stable electrical fields and high cell viabilities inside the channel. Tumor spheroids of various diameters can be exposed to direct current electrical fields for up to one week.
Thermoplastic polycarbonate urethane elastomers (TPCU) are potential implant materials for treating degenerative joint diseases thanks to their adjustable rubber-like properties, their toughness, and their durability. We developed a water-containing high-molecular-weight sulfated hyaluronic acid coating to improve the interaction of TPCU with the synovial fluid. It is suggested that trapped synovial fluid can act as a lubricant that reduces friction forces and thus provides enhanced abrasion resistance of TPCU implants. The aims of this work were (i) the development of a coating method for novel soft TPCUs with high-molecular-weight sulfated hyaluronic acid to increase biocompatibility and (ii) the in vitro validation of the functionalized TPCUs in cell culture experiments.
Medical implants play a central role in modern medicine, and both naturally derived and synthetic materials have been explored as biomaterials for such devices. However, when implanted into living tissue, most materials initiate a host response. In addition, implants often cause bacterial infections leading to complications. Polyelectrolyte multilayer (PEM) coatings can be used for the functionalization of medical implants, improving implant integration and reducing foreign body reactions. Some PEMs are also known to show antibacterial properties. We developed a PEM coating suggested to decrease the risk of bacterial infections occurring after implantation while being highly biocompatible. We applied two different standard tests for evaluating the PEM's antibacterial properties, the ISO 22196 test and the ASTM E2180 test. We found a reduction of bacterial growth on the PEM, but to a different degree depending on the testing method. This result demonstrates the need to define a proper method for evaluating the antibacterial properties of surface coatings.
This work is a comparative study of survey tools intended to help developers select a suitable tool for application in an AAL environment. The first step was to identify the basic functionality required of survey tools used for AAL technologies and to compare these tools by their functionality and assignments. The comparison was derived from the data obtained, previous literature studies and further technical data. A list of requirements was stated and ordered by relevance to the target application domain. With the help of an integrated assessment method, a generalized estimate value was calculated and the result is explained. Finally, the planned application of this tool in a running project is described.
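The "generalized estimate value" over a ranked requirements list can be read as a weighted-sum score. A minimal sketch (requirement names, weights, and ratings below are invented for illustration, not the study's actual criteria):

```python
def weighted_score(ratings, weights):
    """Weighted sum of per-requirement ratings, normalized by the
    total weight, so the result stays on the rating scale."""
    total_w = sum(weights.values())
    return sum(ratings[req] * w for req, w in weights.items()) / total_w

# Hypothetical requirements (weight = relevance to the AAL domain)
# and ratings on a 1-5 scale for one candidate survey tool
weights = {"offline use": 3, "accessibility": 5, "export formats": 2}
ratings = {"offline use": 4, "accessibility": 3, "export formats": 5}
score = weighted_score(ratings, weights)
```

Ranking the candidate tools by this score then yields the tool recommendation.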
In previous studies, we used a method for detecting stress that was based exclusively on heart rate and ECG to differentiate between situations such as mental stress, physical activity, relaxation, and rest. As a response of the heart to these situations, we observed different behavior in the root mean square of successive differences between heartbeats (RMSSD). This study aims to analyze virtual reality, experienced via a virtual reality headset, as an effective stressor for future work. The RMSSD is an important marker of the parasympathetic effect on the heart and can provide information about stress. For these measurements, the RR intervals were collected using a chest belt; additional sensors were not used. We conducted experiments with ten subjects who had to drive a simulator for 25 minutes using monitors and 25 minutes using a virtual reality headset. Before starting and after finishing each simulation, the subjects completed a survey describing their mental state. The experiment results show that driving with a virtual reality headset has some influence on heart rate and RMSSD, but it does not significantly increase the stress of driving.
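RMSSD is computed from the RR interval series as the square root of the mean squared successive difference. A minimal sketch (the RR values below are illustrative, not measured data):

```python
import math

def rmssd(rr_intervals_ms):
    """Root mean square of successive RR interval differences (ms),
    a time-domain HRV marker of parasympathetic activity."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# Hypothetical RR series (ms) as recorded by a chest belt
rr = [812, 790, 804, 778, 795]
value = rmssd(rr)  # roughly 20 ms for this series
```

In practice the statistic is computed over a sliding window (e.g. a few minutes) so that changes between rest, stress, and activity become visible.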
Cardiovascular diseases are directly or indirectly responsible for up to 38.5% of all deaths in Germany and thus represent the most frequent cause of death. At present, heart diseases are mainly discovered by chance during routine visits to the doctor or when acute symptoms occur. However, there is no practical method to proactively detect diseases or abnormalities of the heart in the daily environment and to take preventive measures for the person concerned. Long-term ECG devices, as currently used by physicians, are simply too expensive, impractical, and not widely available for everyday use. This work aims to develop an ECG device suitable for everyday use that can be worn directly on the body. For this purpose, an already existing hardware platform will be analyzed, and the corresponding potential for improvement will be identified. A precise picture of the existing data quality is obtained by metrological examination, and corresponding requirements are defined. Based on these identified optimization potentials, a new ECG device is developed. The revised ECG device is characterized by a high integration density and combines all components directly on one board except the battery and the ECG electrodes. The compact design allows the device to be attached directly to the chest. An integrated microcontroller allows digital signal processing without the need for an additional computer. Central features of the evaluation are a peak detection for detecting R-peaks and a calculation of the current heart rate based on the RR interval. To ensure the validity of the detected R-peaks, a model of the anatomical conditions is used. Thus, unrealistic RR-intervals can be excluded. The wireless interface allows continuous transmission of the calculated heart rate. Following the development of hardware and software, the results are verified, and appropriate conclusions about the data quality are drawn. 
As a result, a very compact and wearable ECG device with different wireless technologies, data storage, and evaluation of RR intervals was developed. Some tests yielded runtimes of up to 24 hours with wireless LAN activated and streaming.
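The plausibility check on detected R-peaks can be sketched as a window filter on RR intervals before the heart-rate calculation. This is an illustrative sketch only: the 300-2000 ms bounds (roughly 30-200 bpm) are assumed here, not the device's actual anatomical model:

```python
def heart_rate_bpm(rr_ms):
    """Instantaneous heart rate from a single RR interval."""
    return 60000.0 / rr_ms

def plausible_rr(rr_ms, min_ms=300, max_ms=2000):
    """Accept only RR intervals inside a physiologically plausible
    window; bounds are illustrative assumptions."""
    return min_ms <= rr_ms <= max_ms

# Hypothetical RR stream (ms); 120 and 2600 are detection artefacts
rr_stream = [820, 790, 120, 805, 2600, 798]
valid = [rr for rr in rr_stream if plausible_rr(rr)]
rates = [round(heart_rate_bpm(rr)) for rr in valid]
```

A firmware implementation would typically also bound the change between consecutive RR intervals, which the sketch omits.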
Globalization has increased the number of road trips and vehicles. The result has been an intensification of traffic accidents, which are becoming one of the most important causes of death worldwide. Traffic accidents are often due to human error, the probability of which increases when the cognitive ability of the driver decreases. Cognitive capacity is closely related to the driver’s mental state, as well as other external factors such as the CO2 concentration inside the vehicle. The objective of this work is to analyze how these elements affect driving. We have conducted an experiment with 50 drivers who have driven for 25 min using a driving simulator. These drivers completed a survey at the start and end of the experiment to obtain information about their mental state. In addition, during the test, their stress level was monitored using biometric sensors and the state of the environment (temperature, humidity and CO2 level) was recorded. The results of the experiment show that the initial level of stress and tiredness of the driver can have a strong impact on stress, driving behavior and fatigue produced by the driving test. Other elements such as sadness and the conditions of the interior of the vehicle also cause impaired driving and affect compliance with traffic regulations.
Comparison of sleep characteristics measurements: a case study with a population aged 65 and above
(2020)
Good sleep is crucial for a healthy life. Unfortunately, its quality often decreases with aging. A common approach to measuring sleep characteristics is based on interviewing the subjects or letting them fill in a daily questionnaire and afterward evaluating the obtained data. However, this method entails time and personnel costs for the interviewer and the evaluator of the responses. It would therefore be valuable to collect and evaluate sleep characteristics automatically. To do that, it is necessary to investigate the level of agreement between measurements performed in the traditional way using questionnaires and measurements obtained using electronic monitoring devices. The study presented in this manuscript performs this investigation, comparing the sleep characteristics "time going to bed", "total time in bed", "total sleep time" and "sleep efficiency". A total of 106 night records of elderly persons (aged 65+) were analyzed. The results achieved so far reveal that the degree of agreement between the two measurement methods varies substantially across characteristics, from 31 minutes of mean difference for "time going to bed" to 77 minutes for "total sleep time". For this reason, a direct interchange of the objective and subjective measuring methods is currently not possible.
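The agreement analysis above boils down to computing, per characteristic, the mean difference between paired questionnaire and device values. A minimal sketch with fabricated numbers (the study's data are not reproduced here):

```python
def mean_abs_difference(subjective, objective):
    """Mean absolute difference (in minutes) between paired measurements."""
    assert len(subjective) == len(objective)
    diffs = [abs(s - o) for s, o in zip(subjective, objective)]
    return sum(diffs) / len(diffs)

# e.g. "total sleep time" in minutes over three nights (invented values)
questionnaire = [420, 390, 450]
device = [370, 400, 380]
print(mean_abs_difference(questionnaire, device))
```

A fuller analysis would also examine the sign and spread of the differences (e.g. a Bland-Altman plot) rather than the mean alone.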
The evaluation of the effectiveness of different machine learning algorithms on a publicly available database of signals derived from wearable devices is presented, with the goal of optimizing human activity recognition and classification. From the wide range of body signals, we chose two that are easy to acquire simultaneously with widespread commercial devices (e.g. smartwatches) as well as with custom wearable wireless devices designed for sport, healthcare, or clinical purposes: the photoplethysmographic signal (optically detected subcutaneous blood volume) and the tri-axis acceleration signal. To this end, two widely used algorithms (decision tree and k-nearest neighbor) were tested, and their performance was compared to that of two recent algorithms (particle Bernstein and a Monte Carlo-based regression), both in terms of accuracy and processing time. A data preprocessing phase was also included to improve the performance of the machine learning procedures and to reduce the problem size; a detailed analysis of the compression strategy and its results is also presented.
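As a toy illustration of one of the baseline classifiers named above, the following is a from-scratch k-nearest-neighbor sketch on invented tri-axis acceleration features; the features, labels, and window statistics are placeholders, not the paper's dataset or preprocessing.

```python
from collections import Counter
import math

def knn_predict(X_train, y_train, x, k=3):
    """Majority label among the k nearest training points (Euclidean distance)."""
    dists = sorted((math.dist(xi, x), yi) for xi, yi in zip(X_train, y_train))
    return Counter(label for _, label in dists[:k]).most_common(1)[0][0]

# toy per-window features: [mean_ax, mean_ay, mean_az] (fabricated)
X_train = [[0.0, 0.0, 1.0], [0.1, 0.0, 1.0],   # resting
           [0.9, 0.8, 0.5], [1.0, 0.7, 0.4]]   # walking
y_train = ["rest", "rest", "walk", "walk"]

print(knn_predict(X_train, y_train, [0.05, 0.0, 1.0]))  # → rest
```

In practice the features would be statistics over sliding windows of the acceleration and photoplethysmographic signals, and k would be tuned by cross-validation.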
More than ten years ago, OLED technology was celebrated as a revolution for the packaging industry, yet this revolution failed to materialize in practice. In an industrial cooperation project on developing future scenarios for the pharmaceutical packaging industry, OLED technology emerges as a key technology for the future scenario Smart Packaging 2.0.
High Performance Computing (HPC) enables significant progress in both science and industry. Whereas traditionally parallel applications have been developed to address the grand challenges in science, as of today, they are also heavily used to speed up the time-to-result in the context of product design, production planning, financial risk management, medical diagnosis, as well as research and development efforts. However, purchasing and operating HPC clusters to run these applications requires huge capital expenditures as well as operational knowledge and thus is reserved to large organizations that benefit from economies of scale. More recently, the cloud evolved into an alternative execution environment for parallel applications, which comes with novel characteristics such as on-demand access to compute resources, pay-per-use, and elasticity. Whereas the cloud has been mainly used to operate interactive multi-tier applications, HPC users are also interested in the benefits offered. These include full control of the resource configuration based on virtualization, fast setup times by using on-demand accessible compute resources, and eliminated upfront capital expenditures due to the pay-per-use billing model. Additionally, elasticity allows compute resources to be provisioned and decommissioned at runtime, which allows fine-grained control of an application's performance in terms of its execution time and efficiency as well as the related monetary costs of the computation. Whereas HPC-optimized cloud environments have been introduced by cloud providers such as Amazon Web Services (AWS) and Microsoft Azure, existing parallel architectures are not designed to make use of elasticity. This thesis addresses several challenges in the emergent field of High Performance Cloud Computing. In particular, the presented contributions focus on the novel opportunities and challenges related to elasticity. 
First, the principles of elastic parallel systems as well as related design considerations are discussed in detail. On this basis, two exemplary elastic parallel system architectures are presented, each of which includes (1) an elasticity controller that controls the number of processing units based on user-defined goals, (2) a cloud-aware parallel execution model that handles coordination and synchronization requirements in an automated manner, and (3) a programming abstraction to ease the implementation of elastic parallel applications. To automate application delivery and deployment, novel approaches are presented that generate the required deployment artifacts from developer-provided source code in an automated manner while considering application-specific non-functional requirements. Throughout this thesis, a broad spectrum of design decisions related to the construction of elastic parallel system architectures is discussed, including proactive and reactive elasticity control mechanisms as well as cloud-based parallel processing with virtual machines (Infrastructure as a Service) and functions (Function as a Service). To evaluate these contributions, extensive experimental evaluations are presented.
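The elasticity controller described above can be characterized by a simple control loop: compare a measured metric against a user-defined goal and adjust the number of processing units accordingly. The following is a deliberately simplified reactive sketch; the step-wise doubling policy, thresholds, and names are illustrative assumptions, not the thesis's actual mechanisms.

```python
def control_step(current_units, measured_runtime_s, target_runtime_s,
                 min_units=1, max_units=64):
    """Return the new processing-unit count after one control interval."""
    if measured_runtime_s > target_runtime_s and current_units < max_units:
        return current_units * 2                    # scale out to speed up
    if measured_runtime_s < 0.5 * target_runtime_s and current_units > min_units:
        return max(min_units, current_units // 2)   # scale in to save cost
    return current_units                            # within the goal corridor

# runtime of 120 s against a 60 s goal triggers a scale-out from 4 to 8 units
print(control_step(4, measured_runtime_s=120, target_runtime_s=60))  # → 8
```

A proactive controller would instead predict the metric from a model of the application's scaling behavior before provisioning; both variants are discussed in the thesis.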
The Twelfth International Conference on Advances in Databases, Knowledge, and Data Applications (DBKDA 2020) continued a series of events covering a large spectrum of topics related to advances in database fundamentals, the evolving relationship between databases and other domains, database technologies and content processing, as well as the specifics of databases in application domains. Advances in different technologies and domains related to databases have triggered substantial improvements in content processing, information indexing, and data, process, and knowledge mining. The push came from web services, artificial intelligence, and agent technologies, as well as from the generalization of XML adoption. High-speed communications and computations, large storage capacities, and load balancing for distributed database access allow new approaches to content processing with incomplete patterns, advanced ranking algorithms, and advanced indexing methods. Evolution in e-business, e-health and telemedicine, bioinformatics, finance and marketing, and geographical positioning systems puts pressure on database communities to push the ‘de facto’ methods to support new requirements in terms of scalability, privacy, performance, indexing, and heterogeneity of both content and technology.
To remain relevant and mitigate disruption, traditional companies have to engage in multiple fast-paced experiments in digital offerings: revenue-generating solutions that leverage digital technologies to address customer needs. After launching several digital offering initiatives, reinsurance giant Munich Re noticed that many experienced similar challenges. This briefing describes how Munich Re addressed these common challenges by building a foundation for experimenting more systematically and successfully with digital offerings. The foundation has enabled Munich Re to become a serial innovator of digital offerings.
Lean management has found its way into many companies. Lean concepts place new demands on the type and structure of the required cost information, which traditional cost accounting systems do not immediately meet. Proponents of "lean accounting" therefore propose in part radical changes and a simplification of cost accounting. This article discusses the limitations of traditional cost accounting in implementing lean management and presents selected approaches to "accounting for lean". The analysis shows that lean accounting approaches are too narrowly focused and cannot adequately reflect the plurality of cost accounting functions found in practice. A radical redesign of existing cost accounting systems is therefore rejected as unrealistic and unfounded. The article develops alternative proposals for how lean management concepts and the cost information they require can be integrated into traditional cost accounting systems.
Industrial production facilities account for a significant share of Germany's total energy demand, around 40%. They have therefore been, and continue to be, optimized both technologically and energetically. Technological-economic optimization frequently goes hand in hand with a reduction in energy and material consumption. In addition, the expansion of renewable energy sources is making energy generation increasingly volatile, so that not only reducing absolute energy consumption but also achieving greater flexibility (controlling power over time) is becoming ever more attractive. This often changes the installed power as well as the design of heat dissipation, which affects the dimensioning of equipment such as metal-cutting machine tools.
A holistic approach to digitization enables decision-makers to achieve new efficiency in corporate performance management. Digitalization improves the quality, validity, and speed of information retrieval and processing. At present, most corporations are confronted with the problem of not being able to organize, categorize, and visualize decision-relevant information. To meet the challenges of information management, the Management Cockpit provides an information center for managers. In accordance with the specific working environment of executives, the Management Cockpit offers a quick and comprehensive overview of the company's situation. Today, the current situation of a company is no longer influenced only by internal factors but also by its public image. Social media monitoring and analysis is therefore a crucial component for the external factors of successful management. Real-time monitoring of the emotions and behaviors of consumers and customers thus contributes to effective controlling of all business areas. Intelligent factories promise to collect data for internal factors, but the current reality in manufacturing looks different. Production often consists of a large number of different machines with varying degrees of digitization and limited sensor data availability. In order to close this gap, we developed a compact sensor board with network components, which allows a flexible design with different sensors for a wide variety of applications. The sensor data enable decision-makers to adapt the supply chain based on their internal and external observations in the Management Cockpit. Thanks to its real-time and long-term monitoring and analysis capabilities, the Management Cockpit provides a multi-dimensional view of the company and supports a holistic corporate performance management.
The article studies a novel approach to inflation modeling in economics. We utilize a stochastic differential equation (SDE) of the form dX_t = a X_t dt + b X_t dB_t^H, where B_t^H is a fractional Brownian motion, in order to model inflationary dynamics. Standard economic models do not capture the stochastic nature of inflation in the Eurozone. Thus, we develop a new stochastic approach and take into consideration fractional Brownian motions as well as Lévy processes. The benefits of those stochastic processes are the modeling of interdependence and jumps, which is equally confirmed by empirical inflation data. The article defines and introduces the rules for stochastic and fractional processes and elucidates the stochastic simulation output.
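The SDE above can be simulated with an Euler scheme once increments of the fractional Brownian motion are available; a standard exact way to generate them is a Cholesky factorization of the fBm covariance. The sketch below uses that approach with purely illustrative parameter values (the article's calibration is not reproduced).

```python
import numpy as np

def fbm_increments(n, hurst, dt, rng):
    """Increments of fractional Brownian motion on a grid with step dt."""
    t = dt * np.arange(1, n + 1)
    # fBm covariance: 0.5 * (s^2H + t^2H - |t - s|^2H)
    cov = 0.5 * (t[:, None] ** (2 * hurst) + t[None, :] ** (2 * hurst)
                 - np.abs(t[:, None] - t[None, :]) ** (2 * hurst))
    path = np.linalg.cholesky(cov) @ rng.standard_normal(n)
    return np.diff(np.concatenate(([0.0], path)))

def simulate(x0, a, b, hurst, n=250, dt=1 / 250, seed=0):
    """Euler discretization of dX_t = a X_t dt + b X_t dB_t^H."""
    rng = np.random.default_rng(seed)
    db = fbm_increments(n, hurst, dt, rng)
    x = np.empty(n + 1)
    x[0] = x0
    for i in range(n):
        x[i + 1] = x[i] + a * x[i] * dt + b * x[i] * db[i]
    return x

# illustrative run: drift a, volatility b, and Hurst index H are assumptions
path = simulate(x0=2.0, a=0.02, b=0.05, hurst=0.7)
print(path[0], len(path))
```

For H = 0.5 the process reduces to ordinary geometric Brownian motion; H > 0.5 introduces the long-range dependence that motivates the fractional model.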
Resilience and stability? Setting the course for the banking and financial system in the Corona pandemic
(2020)
Since the global financial crisis of 2008/2009, there has been no challenge to the financial and banking system comparable to that during the Corona crisis.
Weak profitability, unresolved regulatory challenges and increasing competition in the digital sector pose further challenges for banks.
The stability of the financial system and access to financial markets was not at risk during the pandemic. Through joint efforts and better bank capitalisation, the financial system is now more resilient than during the financial crisis.
Provided that grants and loans in the “Next Generation EU” fund are well targeted for structural reforms and investments in the future, this should boost confidence and growth.
However, further improvements in financial stability, such as increased capital requirements, regulation of shadow banks or reforms in financial supervision, are needed.
This paper studies the impact of financial liquidity on the macro-economy. We extend a classic macroeconomic model and compute numerical simulations. The model confirms that persistently low inflation can occur despite a high degree of financial liquidity due to a reallocation of cash, normal and risk-free bonds. In that regard, our model uncovers an explanation of a flat Phillips curve. Overall, our approach contributes to a rather disregarded matter in macroeconomic theory.
With significant advancements in digital technologies, firms find themselves competing in an increasingly dynamic business environment. It is of paramount importance that organizations undertake proper governance mechanisms with respect to their business and IT strategies. Therefore, IT governance (ITG) has become an important factor for firm performance. In recent years, agility has evolved as a core concept for governance, especially in the area of software development. However, the impact of agility on ITG and firm performance has not been analyzed by the broad scientific community. This paper focuses on the question of how the concept of agility affects the ITG–firm performance relationship. The conceptual model for this question was tested by a quantitative research process with 400 executives responding to a standardized survey. Findings show that the adoption of agile principles, values, and best practices in the context of ITG leads to meaningful results for governance, business/IT alignment, and firm performance.
Autonomous driving is becoming the next big digital disruption in the automotive industry. However, the possibility of integrating autonomous driving vehicles into current transportation systems not only involves technological issues but also requires the acceptance and adoption of users. Therefore, this paper develops a conceptual model for user acceptance of autonomous driving vehicles. The corresponding model is tested through a standardized survey of 470 respondents in Germany. Finally, the findings are discussed in relation to the current developments in the automotive industry, and recommendations for further research are given.
The shift of populations to cities is creating challenges in many respects, thus leading to increasing demand for smart solutions to urbanization problems. Smart city applications range from technical and social to economic and ecological. The main focus of this work is to provide a systematic literature review of smart city research to answer two main questions: (1) How is current research on smart cities structured? and (2) What directions are relevant for future research on smart cities? To answer these research questions, a text-mining approach is applied to a large number of publications. This provides an overview and gives insights into relevant dimensions of smart city research. Although the main dimensions of research are already described in the literature, an evaluation of the relevance of such dimensions is missing. Findings suggest that the dimensions of environment and governance are popular, while the dimension of economy has received only limited attention.
Our paper investigates the response of acquiring firms’ stock returns around the announcement date in cross-border mergers and acquisitions (M&A) between listed Chinese acquirers and German targets. We apply an event study methodology to examine the shareholder value effect based on a sample of M&A deals over the recent period 2012-2018, using a market model event study following the argumentation of Brown and Warner (1985) and short-term observation periods according to Andrade, Mitchell, and Stafford (2001) as well as Hackbarth and Morellec (2008). The results indicate that the announcement of M&A involving German targets results in a positive cumulative abnormal return of, on average, 2.18% for Chinese acquirers’ shareholders in a five-day symmetric event window. Furthermore, we found slight indications of possible information leakage prior to the formal announcement. Although the size of the acquiring firm is not necessarily correlated with the positive abnormal returns in the short run, this study suggests that Chinese acquirers’ shareholders gain higher abnormal returns when the German targets are non-listed companies.
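A market model event study of the kind described above fits alpha and beta over an estimation window, computes abnormal returns as actual minus expected returns in the event window, and sums them to the cumulative abnormal return (CAR). The sketch below uses fabricated return series with a known injected abnormal return; it illustrates the mechanics, not the paper's data.

```python
import numpy as np

def car(stock, market, est_end, ev_start, ev_end):
    """Cumulative abnormal return over the event window [ev_start, ev_end]."""
    # market model: fit stock = alpha + beta * market over the estimation window
    beta, alpha = np.polyfit(market[:est_end], stock[:est_end], 1)
    expected = alpha + beta * market[ev_start:ev_end + 1]
    abnormal = stock[ev_start:ev_end + 1] - expected
    return abnormal.sum()

# fabricated daily returns: stock tracks the market exactly in the
# estimation window, with a known 2% abnormal return injected on one event day
rng = np.random.default_rng(1)
market = rng.normal(0, 0.01, 30)
stock = 0.001 + 1.2 * market
stock[25] += 0.02
print(round(car(stock, market, est_end=20, ev_start=23, ev_end=27), 4))  # → 0.02
```

Significance would then be judged against the abnormal-return variance from the estimation window, as in Brown and Warner (1985).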
As part of the research project, finishing agents and processes were developed for the preventive protection of textiles (especially floor coverings) against soiling. The process provides for a combined finishing of textiles with fluorinated polymers containing incorporated nanoparticles (primarily SiO2) to increase roughness. Commercially available hydrophobizing agents (fluorocarbon- or hydrocarbon-based polymers) in combination with SiO2 nanoparticles were applied to carpets and examined with regard to soiling, e.g. by coffee, KoolAid, red wine, AATCC standard soil, and black shoe polish. For this purpose, the shear-sensitive dispersions of the hydrophobizing agents were combined with newly developed, adapted dispersions of SiO2 nanoparticles. The SiO2 nanoparticles were synthesized with systematically varied sizes of 10-1,000 nm, comprehensively characterized, and stabilized with the aid of newly developed fluoromethacrylate copolymers bearing reactive groups (maleic, itaconic, or citraconic anhydride) and hydrophilic modifiers (alcohol or amine groups). The resulting polymer-particle dispersions could be applied to textiles (PA, PES, or WO carpets and fabrics) from aqueous or ethanolic-aqueous solutions. Furthermore, the newly developed fluorocarbon polymers were also tested with regard to their application. In soiling tests, the carpets finished in this way exhibited less soiling by standard soil than reference materials. The durability of the finish under mechanical stress could be improved by crosslinking the polymers on the textile material.
For PA 6 and PA 6.6 carpets, the best results with regard to reduced soiling by water-soluble stains (coffee, red wine, KoolAid) compared to untreated carpets were obtained when the finish was applied with fluoropolymer-stabilized SiO2 nanoparticles or with a combined dispersion of SiO2 particles and fluorocarbon resins. Less pronounced soiling by AATCC standard soil (DIN EN ISO 11378-2) compared to untreated carpets was found for PA 6 carpets treated with SiO2 particles. Hydrophobic stains (e.g. black shoe polish) were best removed from carpets finished with fluorocarbon polymers. The combination of SiO2 particles with fluorocarbon polymers usually proved more favorable than treatment with fluorocarbon resins alone. A relationship between nanoparticle size, abrasion resistance, and cleaning properties was established, and it could be shown that FC-nanoparticle composites improve these properties. The mechanical durability of the anti-soiling finish with SiO2 nanoparticles and fluorocarbon polymers on polyamide carpets was tested, e.g. by hexapod tumbler testing (according to ISO 10361). SEM, IR spectroscopy, and the water droplet test demonstrated an intact coating after 4,000 and even after 12,000 revolutions. With crosslinkers that link the polymer itself, the polymer with the particles, and/or the substrate surface, the abrasion resistance could be improved in some cases (more suitable crosslinkers may need to be identified here).
The finishing of textiles with sol-gel coatings has been pursued intensively for several years. A multitude of known, but also new, finishing effects can be realized via this approach. The sol-gel technique is particularly interesting because of the possibility of synthesizing multifunctional finishes. A problem in many cases is the low durability of such finishes, especially with respect to washing processes. The aim of the project was therefore to develop pretreatment strategies for textile fiber materials, based on synthetic polymers or natural fibers, that improve the durability of sol-gel-based finishes. In the course of the work, functional groups adapted to the respective fiber polymers (polyethylene terephthalate, polyamide, polypropylene, and cotton) were established on the polymers via suitable anchors; these groups are able to form covalent bonds to sol-gel-based coating systems. Primarily trialkoxysilanes were used as anchors, which additionally carry, e.g., epoxy-, isocyanato-, azido-, or amino-functional residues. Via these residues, the anchors can be covalently bonded to the polymers. Most sol-gel-based systems contain, at least to some extent, SiOx and/or MexOy clusters. The alkoxysilanes used to functionalize the surfaces can generally be bonded to such systems/clusters by condensation and therefore serve for the effective attachment of a wide variety of functional sol-gel layers. Correspondingly prefunctionalized substrates were subsequently coated with exemplarily selected sol-gel finishes, with hydrophobizing sols applied for the majority of the investigations.
It is advantageous that the finishing effect achieved with hydrophobizing sols, as well as its durability, can be characterized with comparatively manageable effort by examining wettability (DuPont grades, contact angles, droplet sink-in times). The effectiveness of the pretreatments was then verified primarily by investigating the wash durability of the finishes. The work showed that the durability of sol-gel finishes, or of the effects resulting from them, can be improved by establishing suitable anchors. At the same time, it became apparent that the improvements achieved depend very strongly on the respective sol; i.e., improvements achieved cannot necessarily be transferred to other sols. Analytical characterizations indicate that in many cases the durability of the coating networks themselves has a far greater influence than the attachment to the substrate. Various investigations showed that the add-on of the sol-gel coating decreases significantly, especially after a first wash but also beyond it, yet often without the effect achieved by the finish being lost. This points to a (partial) dissolution of the coating matrices, against which the anchors cannot protect, since their effect is limited to the interface with the substrate. In addition to the hydrophobizing finishes, antibacterial finishes were also applied exemplarily after the corresponding pretreatments. Here too, improvements in the durability of the effect were achieved. Finally, it was examined to what extent the pretreatments, compared to simple finishing, negatively affect the textile products. For this purpose, relevant textile parameters such as maximum tensile forces, whiteness, stiffness, and air permeability were determined.
For the vast majority of pretreatments, these parameters were not affected, or only slightly.
Universities are part of the innovation ecosystem: in a cooperative exchange relationship, they promote the regional economy and societal development. Fostering innovation, creativity, and entrepreneurial thinking is therefore an important task. As early as 2005, the European Commission defined entrepreneurial thinking and acting as a key competence for the 21st century: "Entrepreneurial competence is the ability to turn ideas into action" (European Commission, 2005, p. 21). Entrepreneurship education is booming, and the promotion of entrepreneurial competences at universities is being pushed forward; fostering a start-up culture is thus not only part of business education but should rather be understood as a cross-cutting task. The entrepreneurial mission is changing the teaching and learning culture at universities. On the one hand, the goal is to anchor entrepreneurship broadly across universities: entrepreneurial thinking and acting is a core competence. On the other hand, start-up education at universities actively promotes entrepreneurial talent and spin-offs.
The project "Spinnovation" is a joint project of Hochschule Reutlingen, Hochschule Aalen, and Hochschule der Medien, funded by the Ministerium für Wissenschaft, Forschung und Kunst Baden-Württemberg under the call "Gründungskultur in Studium und Lehre" (start-up culture in study and teaching). Since 2016, numerous new offerings for students have been developed at the participating universities in order to integrate entrepreneurship education into the curriculum and to bring about a change of mindset towards entrepreneurship and innovation. Based on the experience and results of the joint project Spinnovation, concrete recommendations for entrepreneurship education at universities can be derived.
With the expansion of cyber-physical systems (CPSs) across critical and regulated industries, systems must be continuously updated to remain resilient. At the same time, they should be extremely secure and safe to operate and use. The DevOps approach caters to business demands of more speed and smartness in production, but it is extremely challenging to implement DevOps due to the complexity of critical CPSs and requirements from regulatory authorities. In this study, expert opinions from 33 European companies expose the gap in the current state of practice on DevOps-oriented continuous development and maintenance. The study contributes to research and practice by identifying a set of needs. Subsequently, the authors propose a novel approach called Secure DevOps and provide several avenues for further research and development in this area. The study shows that, because security is a cross-cutting property in complex CPSs, its proficient management requires system-wide competencies and capabilities across the CPSs development and operation.
Tissue constructs of physiologically relevant scale require a vascular system to maintain cell viability. However, in vitro vascularization of engineered tissues is still a major challenge. Successful approaches are based on a feeder layer (FL) to support vascularization. Here, we investigated whether the supporting effect on the self‐assembled formation of prevascular‐like structures by microvascular endothelial cells (mvECs) originates from the FL itself or from its extracellular matrix (ECM). Therefore, we compared the influence of ECM, either derived from adipose‐derived stem cells (ASCs) or adipogenically differentiated ASCs, with the classical cell‐based FL. All cell‐derived ECM (cdECM) substrates enabled mvEC growth with high viability. Prevascular‐like structures were visualized by immunofluorescence staining of endothelial surface protein CD31 and could be observed on all cdECM and FL substrates but not on control substrate collagen I. On adipogenically differentiated ECM, longer and higher branched structures could be found compared with stem cell cdECM. An increased concentration of proangiogenic factors was found in cdECM substrates and FL approaches compared with controls. Finally, the expression of proteins associated with tube formation (E‐selectin and thrombomodulin) was confirmed. These results highlight cdECM as promising biomaterial for adipose tissue engineering by inducing the spontaneous formation of prevascular‐like structures by mvECs.
Elasticity is considered to be the most beneficial characteristic of cloud environments, which distinguishes the cloud from clusters and grids. Whereas elasticity has become mainstream for web-based, interactive applications, how to leverage elasticity for applications from the high-performance computing (HPC) domain, which heavily rely on efficient parallel processing techniques, is still a major research challenge. In this work, we specifically address the challenges of elasticity for parallel tree search applications. Well-known meta-algorithms based on this parallel processing technique include branch-and-bound and backtracking search. We show that their characteristics render static resource provisioning inappropriate and the capability of elastic scaling desirable. Moreover, we discuss how to construct an elasticity controller that reasons about the scaling behavior of a parallel system at runtime and dynamically adapts the number of processing units according to user-defined cost and efficiency thresholds. We evaluate a prototypical elasticity controller based on our findings by employing several benchmarks for parallel tree search and discuss the applicability of the proposed approach. Our experimental results show that, by means of elastic scaling, the performance can be controlled according to user-defined thresholds, which cannot be achieved with static resource provisioning.
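The efficiency-threshold reasoning sketched above can be reduced to one check: measured speedup per processing unit is compared against a user-defined lower bound before further scale-out is allowed. The function and values below are an illustration under assumed inputs, not the paper's controller.

```python
def allow_scale_out(t_serial, t_parallel, units, min_efficiency=0.6):
    """True if parallel efficiency still exceeds the user-defined threshold."""
    speedup = t_serial / t_parallel      # how much faster than one unit
    efficiency = speedup / units         # speedup normalized per unit
    return efficiency >= min_efficiency

# 100 s serial vs 16 s on 8 units: speedup 6.25, efficiency ~0.78 -> scale out ok
print(allow_scale_out(t_serial=100.0, t_parallel=16.0, units=8))  # → True
```

For irregular workloads like parallel tree search, the serial baseline is typically estimated online, since the search space (and hence t_serial) is not known in advance.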
Different types of raw cotton were investigated with a commercial ultraviolet-visible/near infrared (UV-Vis/NIR) spectrometer (210–2200 nm) as well as on a home-built setup for NIR hyperspectral imaging (NIR-HSI) in the range 1100–2200 nm. UV-Vis/NIR reflection spectroscopy reveals the dominant role that proteins, hydrocarbons and hydroxyl groups play in the structure of cotton. NIR-HSI shows a similar result. Experimentally obtained data in combination with principal component analysis (PCA) provide a general differentiation of the cotton types. For UV-Vis/NIR spectroscopy, the first two principal components (PCs) represent 82 % and 78 % of the total data variance for the UV-Vis and NIR regions, respectively. For NIR-HSI, owing to the large amount of data acquired, two data-processing methodologies were applied, at low and at high lateral resolution: in the first, the average spectrum of each sample was calculated; in the second, the spectra of each pixel were used. Both methods are able to explain ≥90 % of the total variance with the first two PCs. The results show that it is possible to distinguish between different cotton types based on a few selected wavelength ranges. The combination of HSI and multivariate data analysis has strong potential for industrial applications due to its short acquisition time and low development cost. This study opens a novel possibility for the further development of this technique towards real large-scale processes.
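The variance captured by the first two principal components, as reported above, is a standard PCA quantity. A minimal sketch on synthetic stand-in data (not the study's spectra) shows how it is computed via the singular value decomposition of the mean-centered data matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-in for reflection spectra: 60 samples x 200 wavelength
# channels, generated from two latent components plus measurement noise.
latent = rng.normal(size=(60, 2))
loadings = rng.normal(size=(2, 200))
spectra = latent @ loadings + 0.05 * rng.normal(size=(60, 200))

# PCA via SVD of the mean-centered data matrix
centered = spectra - spectra.mean(axis=0)
U, s, Vt = np.linalg.svd(centered, full_matrices=False)
variance_ratio = s**2 / np.sum(s**2)     # per-PC share of total variance
scores = centered @ Vt[:2].T             # sample coordinates on PC1/PC2

print(f"PC1+PC2 explain {variance_ratio[:2].sum():.0%} of total variance")
```

Because the synthetic data are essentially rank two, the first two PCs capture nearly all of the variance; on real spectra the ratio reflects how much of the measured variability is structured rather than noise.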
The data presented in this article characterize the thermomechanical and microhardness properties of a novel melamine-formaldehyde (MF) resin intended for use as a self-healing surface coating. The investigated MF resin is able to undergo reversible crosslinking via Diels-Alder reactive groups. The microhardness data were obtained from nanoindentation measurements performed on solid resin film samples at different stages of the self-healing cycle. Thermomechanical analysis was performed under dynamic load conditions. The data supplement the article by Urdl et al. (2020, https://doi.org/10.1016/j.eurpolymj.2020.109601) on the self-healing performance of this resin, which contains a more thorough discussion of the preparation and properties of this coating material and of its application in impregnated paper-based decorative laminates.
Thermoplastic polymers like ethylene-octene copolymer (EOC) may be grafted with silanes via reactive extrusion to enable subsequent crosslinking for the manufacture of advanced biomaterials. However, this reactive extrusion process is difficult to control, and it is still challenging to reproducibly arrive at well-defined products. Moreover, high grafting degrees require a considerable excess of grafting reagent. A large proportion of the silane passes through the process without reacting and must be removed at great expense by subsequent purification. This results in unnecessarily high consumption of chemicals and a rather resource-inefficient process. Suitable process control should therefore allow target grafting degrees to be reached with optimum grafting efficiency. In this study, the continuous grafting of vinyltrimethoxysilane (VTMS) onto EOC via reactive extrusion was investigated. Successful grafting was verified and quantified by 1H-NMR spectroscopy. The effects of five process parameters and their synergistic interactions on grafting degree and grafting efficiency were determined using a face-centered experimental design (FCD). Response surface methodology (RSM) was applied to derive a causal process model and to define process windows yielding arbitrary grafting degrees between <2% and >5% at minimum waste of grafting agent. The reactive extrusion process was found to be strongly influenced by several second-order interaction effects, making it difficult to control. Grafting efficiencies between 75 and 80% can be realized as long as grafting degrees <2% are admitted.
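Response surface methodology fits a second-order polynomial to the responses measured at the design points. A minimal illustration with synthetic values follows (the study's actual factor levels and measured grafting degrees are not reproduced here); for two factors, the face-centered design coincides with the 3x3 grid of coded levels used below:

```python
import numpy as np

# Coded factor levels (-1, 0, +1) for two factors; for k=2 the face-centered
# design (4 corners, 4 face centers, 1 center point) is exactly this grid.
x1g, x2g = np.meshgrid([-1.0, 0.0, 1.0], [-1.0, 0.0, 1.0])
x1, x2 = x1g.ravel(), x2g.ravel()

# Synthetic response standing in for a measured grafting degree,
# generated from a known second-order model for demonstration.
y = 3.0 + 0.8 * x1 + 0.5 * x2 + 0.3 * x1 * x2 - 0.2 * x1**2

# Design matrix of the full quadratic model:
# y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
# coef recovers [b0, b1, b2, b12, b11, b22]
```

The fitted coefficients then define the response surface over which process windows (e.g. regions of acceptable grafting degree) can be read off.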
The judgment of the German Federal Constitutional Court (BVerfG) of 5 May 2020 marks both the conclusion of and a new beginning after years of constitutional and economic dispute. At its core, it concerns the constitutive principles of the eurozone and the mandate of the European Central Bank (ECB). The EU Treaty sets out the guard rails of the Economic and Monetary Union (EMU) in the tension between Articles 119, 123 and 125 of the Treaty on the Functioning of the European Union (TFEU). Accordingly, economic policy sovereignty, following the principle of liability and control, rests solely with the member states. The institutions of the European Union (EU) and the Court of Justice of the European Union (CJEU) regularly interpret these guard rails with wide discretion, guided by the notion of an "ever closer union" in Article 1 of the Treaty on European Union (TEU).
Some widely used optical measurement systems require a scan in wavelength or in one spatial dimension to measure the topography in all three dimensions. Novel hyperspectral sensors based on an extended Bayer pattern have high potential to solve this issue, as they can measure three dimensions in a single shot. This paper presents a detailed examination of such a hyperspectral sensor, including a description of the measurement setup. The evaluated sensor (Ximea MQ022HG-IM-SM5X5-NIR) offers 25 channels based on Fabry–Pérot filters. The setup illuminates the sensor with discrete wavelengths under a specified angle of incidence. This allows the spatial and angular response to the illumination of every channel of each macropixel of the tested sensor to be characterized. The results of the characterization form the basis for a spectral reconstruction of the signal, which is essential to obtain an accurate spectral image. It turned out that irregularities of the signal response of the individual filters are present across the whole sensor.
Here, the effects of substituting portions of fossil-based phenol in phenol-formaldehyde resin by renewable lignin from two different sources are investigated using a factorial screening experimental design. Among the resins consumed by the wood-based industry, phenolics are one of the most important types used for impregnation, coating or gluing purposes. They are prepared by condensing phenol with formaldehyde (PF). One major use of PF is as the matrix polymer for decorative laminates in exterior cladding and wet-room applications. Important requirements for such PFs are favorable flow properties (low viscosity), rapid curing behavior (high reactivity) and sufficient self-adhesion capacity (high residual curing potential). Partially substituting phenol in PF with bio-based phenolic co-reagents like lignin modifies the physicochemical properties of the resulting resin. In this study, phenol-formaldehyde formulations were synthesized in which either 30% or 50% (by weight) of the phenol monomer was substituted by either sodium lignosulfonate or Kraft lignin. The effect of modifying the lignin material by phenolation before incorporation into the resin synthesis was also investigated. The resins so obtained were characterized by Fourier transform infrared (FTIR) spectroscopy, size exclusion chromatography (SEC), differential scanning calorimetry (DSC), rheology, and measurements of contact angle and surface tension using the Wilhelmy plate method and drop shape analysis.
Improvement of a three-layered in vitro skin model for topical application of irritating substances
(2020)
In the field of skin tissue engineering, the development of physiologically relevant in vitro skin models comprising all skin layers, namely epidermis, dermis and subcutis, is a great challenge. Increasing regulatory requirements and the ban on animal experiments for substance testing demand the development of reliable and in vivo-like test systems that enable high-throughput screening of substances. However, the reproducibility and applicability of in vitro testing has so far been insufficient due to fibroblast-mediated contraction. To overcome this pitfall, an advanced 3-layered skin model was developed. While the epidermis of standard skin models showed an 80% contraction, the initial epidermal area of our advanced skin models was maintained. The improved barrier function of the advanced models was quantified by an indirect barrier function test and a permeability assay. Histochemical and immunofluorescence staining of the advanced model showed well-defined epidermal layers, a dermal part with distributed human dermal fibroblasts, and a subcutis with round-shaped adipocytes. The successful response of these advanced 3-layered models in skin irritation testing demonstrated their suitability as an in vitro model for these clinical tests: only the advanced model classified irritative and non-irritative substances correctly. These results indicate that the advanced setup of the 3-layered in vitro skin model maintains the skin barrier function, making it more suitable for irritation testing.
Impregnated paper-based decorative laminates prepared from lignin-substituted phenolic resins
(2020)
High Pressure Laminate (HPL) panels consist of stacks of self-gluing paper sheets soaked with phenol-formaldehyde (PF) resins. An important requirement for such PFs is that they must rapidly penetrate and saturate the paper pores. Partially substituting phenol with bio-based phenolic chemicals like lignin changes the physico-chemical properties of the resin and affects its ability to penetrate the paper. In this study, PF formulations containing different proportions of lignosulfonate and kraft lignin were used to prepare paper-based laminates. The penetration of a Kraft paper sheet was characterized with a recently introduced device that measures the conductivity between both sides of the paper sheet after a drop of resin is placed on the surface and allowed to penetrate the sheet. The main target value measured was the time required for a specific resin to completely penetrate the defined paper sample (“penetration time”). This penetration time generally depends on the molecular weight distribution, the flow behavior and the polarity of the resin, which in turn depend on the manufacturing conditions of the resin. In the present study, the influences of three process factors, (1) the type of lignin material used for substitution, (2) lignin modification by phenolation and (3) the degree of phenol substitution, on the penetration times of various lignin-phenolic hybrid impregnation resins were studied using a complete two-level, three-factor experimental design. Thin laminates made with the resins diluted in methanol were mechanically tested in terms of tensile and flexural strains, and their cross-sections were studied by light microscopy.
For a business, it is important to know within which periods it must, as a buyer, assert its rights in the event of material defects. If the business is the seller, it can only sit back once the limitation period has expired and be certain that no further warranty claims can be asserted against it. In domestic transactions, sellers and buyers alike have meanwhile become accustomed to the standard two-year period of the German Civil Code (BGB). Which periods apply in cross-border business, however, is often unclear, because national limitation periods frequently differ: in Europe alone, limitation periods for warranty claims under sales law range from six months to six years.
Bone tissue is highly vascularized. The crosstalk of vascular and osteogenic cells is not only responsible for the formation of the strongly divergent tissue types but also for their physiological maintenance and repair. Extrusion-based bioprinting presents a promising fabrication method for bone replacement. It allows for the production of large-volume constructs, which can be tailored to individual tissue defect geometries. In this study, we used the all-gelatin-based toolbox of methacryl-modified gelatin (GM), non-modified gelatin (G) and acetylated GM (GMA) to tailor both the properties of the bioink towards improved printability, and the properties of the crosslinked hydrogel towards enhanced support of vascular network formation by simple blending. The vasculogenic behavior of human dermal microvascular endothelial cells (HDMECs) and human adipose-derived stem cells (ASCs) was evaluated in the different hydrogel formulations for 14 days. Co-culture constructs including a vascular component and an osteogenic component (i.e. a bone bioink based on GM, hydroxyapatite and ASCs) were fabricated via extrusion-based bioprinting. Bioprinted co-culture constructs exhibited functional tissue-specific cells whose interplay positively affected the formation and maintenance of vascular-like structures. The setup further enabled the deposition of bone matrix associated proteins like collagen type I, fibronectin and alkaline phosphatase within the 30-day culture.
Azide-bearing cell-derived extracellular matrices (“clickECMs”) have emerged as a highly exciting new class of biomaterials. They conserve substantial characteristics of the natural extracellular matrix (ECM) while simultaneously offering small abiotic functional groups that enable bioorthogonal bioconjugation reactions. Despite their attractiveness, investigation of their biomolecular composition is very challenging due to the insoluble and highly complex nature of cell-derived matrices (CDMs). Yet, thorough qualitative and quantitative analysis of the overall material composition, organisation, localisation, and distribution of typical ECM-specific biomolecules is essential for the consistent advancement of CDMs and the understanding of the prospective functions of the developed biomaterial. In this study, we evaluated frequently used methods for the analysis of complex CDMs. Sodium dodecyl sulphate polyacrylamide gel electrophoresis (SDS-PAGE) and (immuno)histochemical staining methods in combination with several microscopic techniques were found to be highly eligible. Commercially available colorimetric protein assays turned out to deliver inaccurate information on CDMs. In contrast, we determined the nitrogen content of CDMs by elemental analysis and converted it into total protein content using conversion factors calculated from matching amino acid compositions. The amount of insoluble collagens was assessed based on the hydroxyproline content. The Sircol™ assay was identified as a suitable method to quantify soluble collagens, while the Blyscan™ assay was found to be well suited for the quantification of sulphated glycosaminoglycans (sGAGs). Finally, we propose a series of suitable methods to reliably characterise the biomolecular composition of fibroblast-derived clickECM.
The extracellular matrix (ECM) naturally surrounds cells in humans and therefore represents the ideal biomaterial for tissue engineering. ECMs from different tissues differ in composition and physical characteristics. Thus, the ECM not only provides physical support but also contains crucial biochemical signals that influence cell adhesion, morphology, proliferation and differentiation. Besides native ECM from mature tissue, ECM can also be obtained from the in vitro culture of cells. In this study, we aimed to highlight the supporting effect of cell-derived ECM (cdECM) on adipogenic differentiation. Adipose-derived stem cells (ASCs) were seeded on top of cdECM from ASCs (scdECM) or pre-adipocytes (acdECM). The impact of the ECM on cellular activity was determined by LDH, WST I and BrdU assays. The supporting effect of cdECM substrates on adipogenic differentiation was assessed by oil red O staining and subsequent quantification. The results revealed no effect of cdECM substrates on cellular activity. Regarding adipogenic differentiation, a supporting effect of cdECM substrates was observed compared with the control. With these results, we confirm cdECM as a promising biomaterial for adipose tissue engineering.
JumpAR combines the world of augmented reality (AR) with the world-famous jump 'n' run genre in a mobile game. The player creates an individual course in his real environment and navigates the game character through it on virtual platforms. The JumpAR prototype, developed with Unity, was analyzed in a user test after the basic functions and mechanics had been implemented. Integrating real objects from the player's surroundings tightly links the virtual and real worlds during gameplay, which represents a new form of AR interaction for mobile games.
Systemic analysis of the therapeutic robot Paro in comparison with the pet robot AIBO
(2020)
Today, robots are no longer found only in industry but are increasingly used in private life. One example is the social therapy robot Paro. Modeled on the behavior and appearance of a young seal, it expresses emotions and is used primarily in nursing homes, where it has positive effects on the well-being of people in need of care. This paper presents a systemic analysis of the robot Paro, considering its system context, use cases, requirements and structure. This is followed by an analysis of the pet robot AIBO, which resembles a puppy and mainly serves to entertain private users. Similarities and differences between the two systems are worked out. It becomes apparent that both systems primarily keep their users company, but have different requirements and are deployed in different application domains. Moreover, AIBO has more versatile capabilities and a greater degree of movement than Paro, which is reflected in a more complex hardware structure.
A methodology for designing planar spiral antennas with a feeding network embedded within a dielectric is presented. To avoid a purely academic work that could not be manufactured with available standard technologies, the approach takes manufacturing process requirements into account through the choice of materials used in the simulation. General design rules are provided. They encompass, amongst others, selection criteria for the dielectric material, aspects to consider when sketching the radiating element design, and guidelines for the implementation of the feeding network. A rule of thumb was found that may be helpful in determining the height of the antenna's supporting substrate. The appeal of the method resides in the fact that it eases the design process and helps to minimize errors, saving time and money. The approach also enables the design of a compact, small-size spiral antenna as an antenna-in-package (AiP) and provides the opportunity to assemble the antenna with other RF components/systems on the same layer stack or on the same integration platform.
This paper presents a new approach that allows gravity-reduced navigation within a VR environment, such as a simulated moonwalk. The Cyberith Virtualizer is used for navigation in the VR environment. Gravity is simulated by means of an adjustable harness system suspended from elastic ropes, which allows graduated levels of gravity compensation. A spaceship scenario and a lunar surface were generated as environments, in which the current application supports simple interactions. In reference to existing gravity offload systems, the solution is called ViRGOS. ViRGOS has already been used at various visitor events and university functions, so that initial user feedback could be gathered.
Here, we report the mechanical and water sorption properties of a green composite based on Typha latifolia fibres. The composite was prepared either completely binderless or bonded with 10% (w/w) of a bio-based resin, a mixture of an epoxidized linseed oil and a tall-oil-based polyamide. The flexural modulus of elasticity, the flexural strength and the water absorption of hot-pressed Typha panels were measured, and the influence of pressing time and panel density on these properties was investigated. The cure kinetics of the bio-based resin was analyzed by differential scanning calorimetry (DSC) in combination with the iso-conversional kinetic analysis method of Vyazovkin to derive the curing conditions required for a completely cured resin. For the binderless Typha panels, the best technological properties were achieved at high panel density. Adding 10% of the binder resin significantly improved the flexural strength and especially the water absorption.
Today's industrial pattern-making methods rely on construction principles based on mathematical formulas and sizing charts. The result is two-dimensional flats that can be converted into a three-dimensional garment. Because of their high linearity, such patterns cannot recreate the complexity of the human body, which results in insufficient fit. Subsequent pattern changes require a high degree of experience and lead to an inefficient product-development process. Draping is known to allow the development of more complex and demanding patterns that correspond more closely to the actual body shape. This method is therefore used in custom tailoring and haute couture to achieve a perfect garment fit, but it is also time-consuming.
The challenge is thus to improve the fit of garments and speed up production while maintaining good value for money. Reutlingen University is therefore working on the development of 3D-modelled body shapes for 3D draping, considering different layers of clothing such as jackets or coats. For this purpose, 3D modelling is used to develop 3D bodies that correspond to the finished dimensions of the garment. By flattening the modelled body, an optimal 2D pattern of the body can then be obtained. The conventional method and the newly developed method are compared by means of 3D simulation.
Finally, an optical fit test on the simulated basic cuts demonstrates that the newly developed methodology achieves a significantly better wrapping of the body. Unlike the basic cuts created by classical design principles, only a few adjustments are necessary to obtain an optimized basic cut. When considering the distance to the body, the newly developed basic patterns also provide a more even enclosure of the body.
The production of customized bras is a highly challenging process. Although the need is very clear, the lingerie industry currently faces a lack of data, knowledge and expertise for the realization of an automated process chain. Different studies and surveys have shown that the majority of women wear the incorrect bra size. In addition to aesthetic problems, this can result in health risks for the wearer such as headaches, back problems or digestive problems. An important prerequisite for improvement is basic knowledge about the female breast, both in terms of body measurements and different breast shapes. The current size system for bras defines a bra size only by the relation between bust girth and underbust girth, and standardized cup forms do not do justice to the high variability of the human body. As the bra type shapes the female breast, basic knowledge about the relation between the measurements and shapes of the clothed and the unclothed breast is missing.
In the present project, studies are conducted to explore the female breast and to derive new breast-specific body measurements, different breast shapes and deformation knowledge using existing bras.
Furthermore, an innovative process is being developed that leads from 3D scanning to individual and interactive pattern construction, which allows an automatic pattern creation based on individual body measurements and the influence of different material parameters.
In the course of the presentation, the current project status will be shown and the future developments and project steps will be introduced.
Is there a buy button in the consumer's brain? And if so, how can it be pressed? Neuromarketing might provide the answers to these questions. Neuromarketing is part of neuroeconomics and a relatively young discipline at the intersection of cognitive science, neuroscience and market research. Thanks to technological progress, the neurosciences can deliver important insights for marketing, in particular for explaining consumer behavior. By looking into the customer's brain, retail companies, for example, can address their customers in a more targeted way and thus gain an advantage over their competitors.
In Germany, mobility is currently in a state of flux. Since June 2019, electric kick scooters (e-scooters) have been permitted on the roads, and this market is booming. This study employs a user survey to generate new data, supplemented by expert interviews to determine whether such e-scooters are a climate-friendly means of transport. The environmental impacts are quantified using a life cycle assessment. This results in a very accurate picture of e-scooters in Germany. The global warming potential of an e-scooter calculated in this study is 165 g CO2-eq./km, mostly due to material and production (that together account for 73% of the impact). By switching to e-scooters where the battery is swapped, the global warming potential can be reduced by 12%. The lowest value of 46 g CO2-eq./km is reached if all possibilities are exploited and the life span of e-scooters is increased to 15 months. Comparing these emissions with those of the replaced modal split, e-scooters are at best 8% above the modal split value of 39 g CO2-eq./km.
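The reduction figures quoted above follow directly from the per-kilometre values. A quick arithmetic check (all input numbers are taken from the text; the 12% figure applies to battery-swap operation):

```python
# Figures from the study, in g CO2-eq. per km
baseline = 165.0        # standard shared e-scooter (73% from material/production)
modal_split = 39.0      # emissions of the transport modes an e-scooter replaces

# Battery swapping reduces the global warming potential by 12%
with_swap = baseline * (1 - 0.12)
print(round(with_swap, 1))   # per-km emissions with battery swapping
```

The best-case value of 46 g CO2-eq./km additionally assumes an extended 15-month life span, which is a separate scenario rather than a further percentage reduction on this number.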
The motto of the Informatics Inside 2020 autumn conference is KInside. Once again, students look inside and take a closer look at methods, applications and interrelationships. The contributions are diverse and, in keeping with the degree program, human-centered. The aspiration is that the topics revolve around people's needs and that the methods used are not an end in themselves but are measured by their benefit to people.
Innovative capacity is one of the key success factors of the future and will strongly influence the difference between successful and failing companies (PWC, 2015). Young companies and start-ups in particular are known for their high innovative capability. Established companies, by contrast, score less with new ideas but instead offer resources critical to innovation, established routines and economies of scale. "Intrapreneurship" is an approach of steadily growing popularity that combines the capabilities and resources of established companies with the innovative power of start-ups.
Here, we study resin cure and network formation of a solid melamine formaldehyde pre-polymer over a large temperature range via dynamic temperature curing profiles. Real-time infrared spectroscopy is used to analyze the chemical changes during network formation and network hardening. By applying chemometrics (multivariate curve resolution, MCR), the essential chemical functionalities that constitute the network at a given stage of curing are mathematically extracted and tracked over time. The three spectral components identified by MCR were methylol-rich, ether-linkage-rich and methylene-linkage-rich resin entities. Based on the dynamic changes of their characteristic spectral patterns as a function of temperature, curing is divided into five phases: (I) a stationary phase with free methylols as the main chemical feature, (II) formation of a flexible network cross-linked by ether linkages, (III) formation of a rigid, ether-cross-linked network, (IV) further hardening via transformation of methylols and ethers into methylene cross-linkages, and (V) network consolidation via transformation of ether into methylene bridges. The presented spectroscopic/chemometric approach can be used as a methodological basis for the functionality design of MF-based surface films at the stage of laminate pressing, i.e., for tailoring the technological property profile of cured MF films using a causal understanding of the underlying chemistry based on molecular markers and spectroscopic fingerprints.
Modern mixed (HTAP) workloads execute fast update transactions and long-running analytical queries on the same dataset and system. In multi-version concurrency control (MVCC) systems, such workloads result in many short-lived versions and long version chains, as well as increased and frequent maintenance overhead.
Consequently, the index pressure increases significantly. Firstly, the frequent modifications cause frequent creation of new versions, yielding a surge in index maintenance overhead. Secondly, and more importantly, index scans incur extra I/O overhead to determine which of the resulting tuple versions are visible to the executing transaction (visibility check), since current designs store version/timestamp information only in the base table, not in the index. An index-only visibility check is therefore critical for HTAP workloads on large datasets.
In this paper we propose the Multi-Version Partitioned B-Tree (MV-PBT) as a version-aware index structure supporting index-only visibility checks and flash-friendly I/O patterns. The experimental evaluation indicates a 2x improvement for analytical queries and 15% higher transactional throughput under HTAP workloads. MV-PBT offers 40% higher transaction throughput compared with WiredTiger's LSM-Tree implementation under YCSB.
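The index-only visibility check that MV-PBT enables can be illustrated with a deliberately simplified sketch. The entry layout and timestamps below are hypothetical and not the actual MV-PBT format; the point is only that when index entries carry version timestamps, a scan can filter versions without fetching the base table:

```python
# Minimal illustration of an index-only visibility check: each index entry
# carries the version's begin/end timestamps, so a scan can decide visibility
# without an extra base-table lookup.
def visible(entry, tx_ts):
    """entry = (key, begin_ts, end_ts); end_ts=None means 'still current'."""
    _, begin_ts, end_ts = entry
    return begin_ts <= tx_ts and (end_ts is None or tx_ts < end_ts)

index = [            # hypothetical version chain for key 'a'
    ("a", 10, 20),   # version valid in [10, 20)
    ("a", 20, 35),   # version valid in [20, 35)
    ("a", 35, None), # current version, valid from 35 onwards
]
snapshot = [e for e in index if visible(e, tx_ts=25)]
# a transaction with timestamp 25 sees only the version valid in [20, 35)
```

In a design without timestamps in the index, the same scan would need one base-table I/O per candidate version just to make this decision, which is the overhead the paper targets.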
Despite strong political efforts in Europe, industrial small and medium-sized enterprises (SMEs) seem to neglect adopting practices for energy efficiency. Taking a cultural perspective, this study investigated what drives the establishment of energy efficiency and corresponding practices in SMEs. Based on 10 ethnographic case studies and a quantitative survey among 500 manufacturing SMEs, the results indicate the importance of everyday employee behavior in achieving energy savings. The studied enterprises value behavior-related measures as similarly important as technical measures. Raising awareness of energy issues within the organization therefore constitutes an essential leadership task that is oftentimes perceived as challenging and frustrating. It was concluded that the embedding of energy efficiency in corporate strategy, the use of a broad spectrum of different practices, and the empowerment and involvement of employees serve as major drivers in establishing energy efficiency within SMEs. Moreover, the findings reveal institutional influences that shape the meaning of energy efficiency for the SMEs by raising attention to energy efficiency in the enterprises and making energy efficiency decisions more likely. The main contribution of the paper is to offer an alternative perspective on energy efficiency in SMEs beyond the mere adoption of energy-efficient technology.
The automation of work by means of disruptive technologies such as Artificial Intelligence (AI) and Robotic Process Automation (RPA) is currently intensely discussed in business practice and academia. Recent studies indicate that many tasks conducted manually by humans today will no longer be in the future. In a similar vein, it is expected that new roles will emerge. The aim of this study is to analyze prospective employment opportunities in the context of RPA in order to foster our understanding of the pivotal qualifications, expertise and skills necessary to find an occupation in a completely changing world of work. This study is based on an explorative content analysis of 119 job advertisements related to RPA in Germany. The data was collected from major German online job platforms, qualitatively coded, and subsequently analyzed quantitatively. The research indicates that there are indeed employment opportunities, especially in the consulting sector. The positions require diverse technological expertise, such as specific programming languages and knowledge of statistics. The results of this study provide guidance for organizations and individuals on reskilling requirements for future employment. As many of the positions require profound IT expertise, the generally accepted perspective that existing employees affected by automation can be retrained to work in the emerging positions must be viewed extremely critically. This paper contributes to the body of knowledge by providing a novel perspective on the ongoing discussion of employment opportunities and the reskilling demands of the existing workforce in the context of recent technological developments and automation.
The chemical synthesis of polysiloxanes from monomeric starting materials involves a series of hydrolysis, condensation and modification reactions with complex monomeric and oligomeric reaction mixtures. Real-time monitoring and precise process control of the synthesis process is of great importance to ensure reproducible intermediates and products and can readily be performed by optical spectroscopy. In chemical reactions involving rapid and simultaneous functional group transformations and complex reaction mixtures, however, the spectroscopic signals are often ambiguous due to overlapping bands, shifting peaks and changing baselines. The univariate analysis of individual absorbance signals is hence often of limited use. In contrast, batch modelling based on the multivariate analysis of the time course of principal components (PCs) derived from the reaction spectra provides a more efficient tool for real-time monitoring. In batch modelling, not only single absorbance bands are used; instead, information over a broad range of wavelengths is extracted from the evolving spectral fingerprints and used for analysis. Thereby, process control can be based on the numerous chemical and morphological changes taking place during synthesis. "Bad" (or abnormal) batches can quickly be distinguished from "normal" ones by comparing the respective reaction trajectories in real time. In this work, FTIR spectroscopy was combined with multivariate data analysis for the in-line process characterization and batch modelling of polysiloxane formation. The synthesis was conducted under different starting conditions using various reactant concentrations. The complex spectral information was evaluated using chemometrics (principal component analysis, PCA). Specific spectral features at different stages of the reaction were assigned to the corresponding reaction steps. Reaction trajectories were derived based on batch modelling using a wide range of wavelengths.
Subsequently, complexity was reduced again to the most relevant absorbance signals in order to derive a concept for a low-cost process spectroscopic set-up which could be used for real-time process monitoring and reaction control.
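The batch-modelling idea described above can be sketched in a few lines: project each time-resolved reaction spectrum onto a principal component and compare the resulting trajectories of a normal and an abnormal batch. The sketch below is a minimal illustration using synthetic spectra and scikit-learn's PCA; all shapes, band positions, and noise levels are assumptions for demonstration, not the paper's FTIR data.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

# Synthetic "spectra": 50 time points x 200 wavelengths. A normal batch
# follows a smooth reaction trajectory (a band growing linearly in time);
# an abnormal batch stalls halfway. All values are illustrative.
t = np.linspace(0.0, 1.0, 50)
wl = np.linspace(0.0, 1.0, 200)
band = np.exp(-((wl - 0.5) ** 2) / 0.01)           # one absorbance band
normal = np.outer(t, band) + rng.normal(0, 0.01, (50, 200))
abnormal = np.outer(np.clip(t, 0, 0.5), band) + rng.normal(0, 0.01, (50, 200))

# Fit PCA on the normal batch and project both batches onto PC1,
# yielding one reaction trajectory per batch.
pca = PCA(n_components=1).fit(normal)
traj_normal = pca.transform(normal)[:, 0]
traj_abnormal = pca.transform(abnormal)[:, 0]

# Comparing trajectories point by point mimics real-time monitoring:
# a large deviation from the reference trajectory flags a "bad" batch.
deviation = np.abs(traj_normal - traj_abnormal)
is_bad = deviation.max() > 10 * np.median(deviation[:20])
```

In a real set-up, the reference trajectory (with confidence limits) would come from several known-good batches, and the incoming spectra would be projected onto the stored PCA model as the reaction proceeds.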
In recent years, machine learning algorithms have made huge advances in performance and applicability in industry, and especially in maintenance. Their application enables predictive maintenance and thus offers efficiency gains. However, a successful implementation of such solutions still requires high effort in data preparation to obtain the right information, interdisciplinary teams, and good communication with employees. Here, small and medium-sized enterprises (SMEs) often lack experience, competence, and capacity. This paper presents a systematic and practice-oriented method for implementing machine learning solutions for predictive maintenance in SMEs, which has already been validated.
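The abstract above does not name a specific algorithm; as a minimal sketch of the predictive-maintenance idea it describes, the example below trains an anomaly detector on readings from a healthy machine and flags drifting ones. The sensor values and the choice of scikit-learn's IsolationForest are illustrative assumptions, not the paper's method.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(1)

# Synthetic vibration readings: a healthy machine oscillates around 1.0,
# while a worn bearing drifts to higher amplitudes (values are made up).
healthy = rng.normal(loc=1.0, scale=0.05, size=(200, 1))
worn = rng.normal(loc=1.6, scale=0.05, size=(5, 1))

# Train the detector on healthy data only, then score new readings.
model = IsolationForest(random_state=0).fit(healthy)

# predict() returns 1 for normal readings and -1 for anomalies; the
# drifted readings become candidates for preventive maintenance.
flags = model.predict(worn)
```

In practice, the data-preparation effort the abstract emphasizes (cleaning, aligning, and labelling sensor streams) typically dominates over the modelling step shown here.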
Process quality has reached a high level in mass production, utilizing well-known methods such as Design of Experiments (DoE). The drawback of the underlying statistical methods is the need for tests under real production conditions, which cause high costs due to lost output. Research over the last decade has led to methods for correcting process parameters using in-situ data, but a lot of pre-production is still necessary to get this working. This paper presents a new approach to improving product quality in process chains by using context data - gathered in part via Industry 4.0 devices - to reduce the necessary pre-production.
Driven by digital transformation, manufacturing systems are heading towards autonomy. The implementation of autonomous elements in manufacturing systems is still a big challenge. Especially small and medium-sized enterprises (SMEs) often lack the experience to assess their degree of Autonomous Production. Therefore, a description model for assessing the stages of Autonomous Production has been identified as a core element to support such a transformation process. In contrast to existing models, the developed SME-tailored model covers different levels within a manufacturing system, from single manufacturing cells to the factory level. Furthermore, the model has been validated in several case studies.
Matthias Varga von Kibéd and Insa Sparrer distinguish between three different constellation methods (Sparrer and Varga von Kibéd, n.d.): the specific (concrete), the virtual, and the prototypical constellation. Specific constellations address a client's concrete concern. In contrast, virtual constellations create a practice environment in which constellation techniques and intervention methods can be rehearsed. Prototypical structural constellations group together topics that touch several participants in a seminar or that can recur in their everyday lives. Such a topic is worked through like a specific constellation, but without a concrete concern at hand. Examples of prototypical structural constellations come from many areas, e.g. everyday leadership, team development, conflict management, conversation techniques, and time and self-management.