Artificial Intelligence-based Assistants (AIAs) are spreading quickly both in homes and offices. They have already left their original habitat of "intelligent speakers" providing easy access to music collections. They initiated a multitude of new devices and are already populating devices such as TV sets. Characteristic of these intelligent digital assistants is the formation of platforms around their core functionality. Thus, the AI capabilities of the assistants are used to offer new services and create new interfaces for business processes. There are positive network effects between the assistants and the services as well as among the services. Therefore, many companies see the need to get involved in the field of digital assistants but lack a framework to align their initiatives with their corporate strategies. To lay the foundation for a comprehensive method, we therefore investigate intelligent digital assistants. Based on this analysis, we develop a framework of strategic opportunities and challenges.
Background
The central task of electrocardiographic examinations is to increase the reliability of diagnosing the condition of the heart. Within the framework of this task, an important direction is the solution of the inverse problem of electrocardiography, based on the processing of electrocardiographic signals of multichannel cardio leads at known electrode coordinates in these leads (Titomir et al., Noninvasive Electrocardiotopography, 2003; Macfarlane et al., Comprehensive Electrocardiology, 2nd ed., Chapter 9, 2011).
Results
In order to obtain more detailed information about the electrical activity of the heart, we carry out a reconstruction of the distribution of equivalent electrical sources on the heart surface. In this area, we perform the reconstruction of the equivalent sources during the cardiac cycle at relatively low hardware cost. ECG maps of electrical potentials on the surface of the torso (TSPM) and of electrical sources on the surface of the heart (HSSM) were studied for different times of the cardiac cycle. We carried out a visual and quantitative comparison of these maps in the presence of pathological regions of different localization. For this purpose, we used a model of the heart's electrical activity based on cellular automata.
Conclusions
The cellular automata model allows us to consider the processes of heart excitation in the presence of pathological regions of various sizes and localizations. It is shown that changes in the distribution of electrical sources on the surface of the epicardium in the presence of pathological areas with disturbed conduction of heart excitation are much more noticeable than changes in ECG maps on the torso surface.
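The abstract does not specify the automaton's rules. As a toy illustration of an excitable-medium cellular automaton with a conduction-blocked "pathological" cell, a minimal Greenberg-Hastings-style sketch could look as follows (grid size, neighbourhood, states and the blocking rule are all assumptions, not the authors' model):

```python
# Minimal excitable-medium cellular automaton: each cell is resting, excited
# or refractory; excitation spreads to resting 4-neighbours, while cells
# marked as "pathological" never conduct. Purely illustrative.
RESTING, EXCITED, REFRACTORY = 0, 1, 2

def step(grid, blocked):
    """Advance the automaton one time step; blocked cells never conduct."""
    n = len(grid)
    new = [[RESTING] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if (i, j) in blocked:
                new[i][j] = RESTING          # conduction block
            elif grid[i][j] == EXCITED:
                new[i][j] = REFRACTORY       # excited -> refractory
            elif grid[i][j] == REFRACTORY:
                new[i][j] = RESTING          # refractory -> resting
            else:
                # a resting cell fires if any 4-neighbour is excited
                nbrs = [(i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)]
                if any(0 <= a < n and 0 <= b < n and grid[a][b] == EXCITED
                       for a, b in nbrs):
                    new[i][j] = EXCITED
    return new

# a wave started in one corner spreads across the healthy tissue
g = [[RESTING] * 5 for _ in range(5)]
g[0][0] = EXCITED
blocked = {(2, 2)}                            # one pathological cell
for _ in range(4):
    g = step(g, blocked)
print(g[2][2])  # the blocked cell stays resting -> 0
```

Tracking which cells are excited at each step yields the kind of excitation-front picture the authors compare between healthy and pathological cases.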
Intraoperative brain deformation, so-called brain shift, affects the applicability of preoperative magnetic resonance imaging (MRI) data to assist the procedures of intraoperative ultrasound (iUS) guidance during neurosurgery. This paper proposes a deep learning-based approach for fast and accurate deformable registration of preoperative MRI to iUS images to correct brain shift. Based on the architecture of 3D convolutional neural networks, the proposed deep MRI-iUS registration method has been successfully tested and evaluated on the retrospective evaluation of cerebral tumors (RESECT) dataset. This study showed that our proposed method outperforms other registration methods in previous studies, with an average mean squared error (MSE) of 85. Moreover, this method can register three 3D MRI-US pairs in less than a second, improving the expected outcomes of brain surgery.
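The MSE figure reported above is a standard mean-squared-error metric; a generic sketch of its computation over corresponding values (intensities or landmark coordinates, the abstract does not say which) is given below. The sample values are invented, and this is not the authors' evaluation pipeline:

```python
# Generic mean squared error between two equally sized value sequences,
# e.g. corresponding voxel intensities or landmark coordinates.
def mse(a, b):
    """Mean of squared element-wise differences."""
    assert len(a) == len(b)
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

fixed = [10.0, 12.0, 9.0, 11.0]   # reference values (invented)
moved = [9.0, 13.0, 9.5, 10.0]    # values after registration (invented)
print(mse(fixed, moved))  # 0.8125
```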
Purpose: Gliomas are the most common and aggressive type of brain tumors due to their infiltrative nature and rapid progression. The process of distinguishing tumor boundaries from healthy cells is still a challenging task in the clinical routine. Fluid attenuated inversion recovery (FLAIR) MRI modality can provide the physician with information about tumor infiltration. Therefore, this paper proposes a new generic deep learning architecture, namely DeepSeg, for fully automated detection and segmentation of the brain lesion using FLAIR MRI data.
Methods: The developed DeepSeg is a modular decoupling framework. It consists of two connected core parts based on an encoding and decoding relationship. The encoder part is a convolutional neural network (CNN) responsible for spatial information extraction. The resulting semantic map is inserted into the decoder part to get the full-resolution probability map. Based on modified U-Net architecture, different CNN models such as residual neural network (ResNet), dense convolutional network (DenseNet), and NASNet have been utilized in this study.
Results: The proposed deep learning architectures have been successfully tested and evaluated online on the MRI datasets of the brain tumor segmentation (BraTS 2019) challenge, including 336 cases as training data and 125 cases as validation data. The Dice and Hausdorff distance scores of the obtained segmentation results are about 0.81 to 0.84 and 9.8 to 19.7, respectively.
Conclusion: This study showed successful feasibility and comparative performance of applying different deep learning models in a new DeepSeg framework for automated brain tumor segmentation in FLAIR MR images. The proposed DeepSeg is open source and freely available at https://github.com/razeineldin/DeepSeg/.
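The Dice and Hausdorff scores reported in the results can be illustrated on tiny made-up binary segmentations, here given as sets of voxel coordinates (this is an illustrative computation, not the BraTS evaluation code):

```python
# Dice similarity and symmetric Hausdorff distance between two point sets.
def dice(pred, truth):
    """Dice similarity coefficient: 2|A ∩ B| / (|A| + |B|)."""
    return 2 * len(pred & truth) / (len(pred) + len(truth))

def hausdorff(a, b):
    """Symmetric Hausdorff distance between two point sets."""
    d = lambda p, q: sum((pi - qi) ** 2 for pi, qi in zip(p, q)) ** 0.5
    h = lambda s, t: max(min(d(p, q) for q in t) for p in s)
    return max(h(a, b), h(b, a))

pred = {(0, 0), (0, 1), (1, 1)}    # predicted tumor voxels (invented)
truth = {(0, 1), (1, 1), (1, 0)}   # ground-truth voxels (invented)
print(dice(pred, truth))       # 0.666...
print(hausdorff(pred, truth))  # 1.0
```

Dice rewards overlap (1.0 is a perfect match), while the Hausdorff distance penalizes the worst boundary disagreement, which is why both are commonly reported together.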
Here, we report the mechanical and water sorption properties of a green composite based on Typha latifolia fibres. The composite was prepared either completely binder-less or bonded with 10% (w/w) of a bio-based resin which was a mixture of an epoxidized linseed oil and a tall-oil based polyamide. The flexural modulus of elasticity, the flexural strength and the water absorption of hot pressed Typha panels were measured and the influence of pressing time and panel density on these properties was investigated. The cure kinetics of the biobased resin was analyzed by differential scanning calorimetry (DSC) in combination with the iso-conversional kinetic analysis method of Vyazovkin to derive the curing conditions required for achieving completely cured resin. For the binderless Typha panels the best technological properties were achieved for panels with high density. By adding 10% of the binder resin the flexural strength and especially the water absorption were improved significantly.
In recent years, machine learning algorithms have made a huge development in performance and applicability in industry and especially maintenance. Their application enables predictive maintenance and thus offers efficiency increases. However, a successful implementation of such solutions still requires high effort in data preparation to obtain the right information, interdisciplinarity in teams as well as a good communication to employees. Here, small and medium sized enterprises (SME) often lack in experience, competence and capacity. This paper presents a systematic and practice-oriented method for an implementation of machine learning solutions for predictive maintenance in SME, which has already been validated.
Here, we study resin cure and network formation of solid melamine formaldehyde pre-polymer over a large temperature range via dynamic temperature curing profiles. Real-time infrared spectroscopy is used to analyze the chemical changes during network formation and network hardening. By applying chemometrics (multivariate curve resolution, MCR), the essential chemical functionalities that constitute the network at a given stage of curing are mathematically extracted and tracked over time. The three spectral components identified by MCR were methylol-rich, ether-linkage-rich and methylene-linkage-rich resin entities. Based on dynamic changes of their characteristic spectral patterns as a function of temperature, curing is divided into five phases: (I) stationary phase with free methylols as the main chemical feature, (II) formation of a flexible network cross-linked by ether linkages, (III) formation of a rigid, ether-cross-linked network, (IV) further hardening via transformation of methylols and ethers into methylene cross-linkages, and (V) network consolidation via transformation of ether into methylene bridges. The presented spectroscopic/chemometric approach can be used as a methodological basis for the functionality design of MF-based surface films at the stage of laminate pressing, i.e., for tailoring the technological property profile of cured MF films using a causal understanding of the underlying chemistry based on molecular markers and spectroscopic fingerprints.
Our paper investigates the response of acquiring firms' stock returns around the announcement date in cross-border mergers and acquisitions (M&A) between listed Chinese acquirers and German targets. We apply an event study methodology to examine the shareholder value effect based on a sample of M&A deals over the most recent period of 2012-2018. We apply a market model event study based on the argumentation of Brown and Warner (1985) and use short-term observation periods according to Andrade, Mitchell, and Stafford (2001) as well as Hackbarth and Morellec (2008). The results indicate that the announcement of M&A involving German targets results in a positive cumulative abnormal return of on average 2.18% for Chinese acquirers' shareholders in a five-day symmetric event window. Furthermore, we found slight indications of possible information leakage prior to the formal announcement. Although the size of the acquiring firm is not necessarily correlated with the positive abnormal returns in the short run, this study suggests that Chinese acquirers' shareholders gain higher abnormal returns when the German targets are non-listed companies.
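A market-model event study of the Brown and Warner type works in two steps: estimate alpha and beta over an estimation window, then sum abnormal returns (actual minus expected) over the event window. The sketch below uses invented return series and a far shorter estimation window than any real study would:

```python
# Market-model event study sketch: OLS estimation of alpha/beta, then
# cumulative abnormal return (CAR) over the event window.
def ols_market_model(r_stock, r_mkt):
    """Estimate alpha, beta of r_i = alpha + beta * r_m + e by OLS."""
    n = len(r_mkt)
    mx, my = sum(r_mkt) / n, sum(r_stock) / n
    beta = (sum((x - mx) * (y - my) for x, y in zip(r_mkt, r_stock))
            / sum((x - mx) ** 2 for x in r_mkt))
    return my - beta * mx, beta

def car(alpha, beta, r_stock_evt, r_mkt_evt):
    """Cumulative abnormal return over the event window."""
    return sum(y - (alpha + beta * x) for x, y in zip(r_mkt_evt, r_stock_evt))

# estimation window (invented daily returns; real studies use many more days)
r_mkt = [0.01, -0.02, 0.03, 0.00]
r_stock = [0.03, -0.03, 0.07, 0.01]     # constructed so alpha=0.01, beta=2
alpha, beta = ols_market_model(r_stock, r_mkt)

# symmetric event window around the announcement (invented returns)
caar = car(alpha, beta, [0.015, 0.035], [0.00, 0.01])
print(round(caar, 4))  # 0.01, i.e. a 1% cumulative abnormal return
```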
This study investigates how integrated reporting (IR) creates value for investors. It examines how providers of financial capital benefit from an improved firm information environment provided by IR. Specifically, this study investigates the effect of voluntary IR disclosure on analyst earnings forecast accuracy as well as on firm value. To do so, we use an international sample of 167 listed companies that voluntarily publish an integrated report. Our analysis shows no significant effect of a voluntary IR publication on analyst earnings forecast accuracy and no significant effect on firm value. We thus do not find evidence for the fulfillment of IR's promises regarding improved information environment and value creation of voluntary adopters. We conclude that such companies might already have a relatively high level of transparency leading to an absent additional effect of IR disclosure. Positive effects of IR appear to be more relevant in environments where IR is mandatory.
This article analyzes the reform of the IFRS and US GAAP standards on lease accounting. Using the example of McKesson Europe AG, the effects of the first-time application of the standards on the lessee are illustrated. Of particular interest is a comparison of the accounting models under the "old" standards IAS 17 and ASC 840 and under the "new" standards IFRS 16 and ASC 842. The result shows that IFRS and US GAAP are not fully aligned. Differences arise above all in the presentation in the income statement, and these also affect key earnings figures.
nKV in action: accelerating KV-stores on native computational storage with near-data processing
(2020)
Massive data transfers in modern data-intensive systems, resulting from low data locality and data-to-code system design, hurt their performance and scalability. Near-data processing (NDP) designs represent a feasible solution which, although not new, has yet to see widespread use.
In this paper we demonstrate various NDP alternatives in nKV, which is a key/value store utilizing native computational storage and near-data processing. We showcase the execution of classical operations (GET, SCAN) and complex graph-processing algorithms (Betweenness Centrality) in-situ, with 1.4x-2.7x better performance due to NDP. nKV runs on real hardware - the COSMOS+ platform.
The data presented in this article characterize the thermomechanical and microhardness properties of a novel melamine-formaldehyde (MF) resin intended for use as a self-healing surface coating. The investigated MF resin is able to undergo reversible crosslinking via Diels-Alder reactive groups. The microhardness data were obtained from nanoindentation measurements performed on solid resin film samples at different stages of the self-healing cycle. Thermomechanical analysis was performed under dynamic load conditions. The data provide supplemental material to the manuscript published by Urdl et al. 2020 (https://doi.org/10.1016/j.eurpolymj.2020.109601) on the self-healing performance of this resin, where a more thorough discussion of the preparation and properties of this coating material and its application in impregnated paper-based decorative laminates can be found.
The self-healing effect of melamine-based surfaces, triggered by temperature, was investigated. The temperature-triggered reversible healing chemistry on which the self-healing effect is based was the Diels-Alder (DA) reaction between furan and maleimide groups. Melamine-furan building blocks were connected by a multi-functional maleimide crosslinker via a DA reaction to give a DA adduct. The DA adduct was then reacted with formaldehyde to form a network by the conventional condensation reaction of melamine amino groups with formaldehyde. The obtained resin was characterised and used for the impregnation of paper. Impregnated papers and neat resin were used to perform scratch-healing tests and mechanical analysis of the novel coating system.
Thermoplastic polymers like ethylene-octene copolymer (EOC) may be grafted with silanes via reactive extrusion to enable subsequent crosslinking for advanced biomaterials manufacture. However, this reactive extrusion process is difficult to control and it is still challenging to reproducibly arrive at well-defined products. Moreover, high grafting degrees require a considerable excess of grafting reagent. A large proportion of the silane passes through the process without reacting and needs to be removed at great expense by subsequent purification. This results in unnecessarily high consumption of chemicals and a rather resource-inefficient process. It is thus desirable to be able to set target grafting degrees with optimum grafting efficiency by means of suitable process control. In this study, the continuous grafting of vinyltrimethoxysilane (VTMS) on ethylene-octene copolymer (EOC) via reactive extrusion was investigated. Successful grafting was verified and quantified by 1H-NMR spectroscopy. The effects of five process parameters and their synergistic interactions on grafting degree and grafting efficiency were determined using a face-centered experimental design (FCD). Response surface methodology (RSM) was applied to derive a causal process model and define process windows yielding arbitrary grafting degrees between <2 and >5% at a minimum waste of grafting agent. It was found that the reactive extrusion process was strongly influenced by several second-order interaction effects making this process difficult to control. Grafting efficiencies between 75 and 80% can be realized as long as grafting degrees <2% are admitted.
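A face-centered central composite design of the kind mentioned above places its axial points on the faces of the factorial cube, so all runs use only the coded levels -1, 0 and +1. The generator below reproduces that structure; the study's actual run plan (replicates, number of centre points) is not stated in the abstract, so a single centre point is assumed:

```python
# Face-centered central composite design in coded units: 2^k factorial
# corners, 2k axial points on the cube faces, and one centre point.
from itertools import product

def face_centered_design(k):
    """Return the list of coded runs for k factors."""
    corners = [list(p) for p in product((-1, 1), repeat=k)]
    axial = []
    for i in range(k):
        for s in (-1, 1):
            pt = [0] * k
            pt[i] = s          # only one factor moved off centre
            axial.append(pt)
    return corners + axial + [[0] * k]

print(len(face_centered_design(2)))  # 4 + 4 + 1 = 9 runs
print(len(face_centered_design(5)))  # 32 + 10 + 1 = 43 runs for 5 factors
```

For the five process parameters of the study, this base design implies 43 distinct factor combinations before any replication.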
Demands on energy utilities will grow: in the future, tasks such as the development of digitalized products and services as well as ecological activities will gain particular relevance. This is shown by Reutlingen University in its current survey of supervisory board members, managing directors and executives. Despite the expected changes, the supervisory boards are aware of the pressure to professionalize, but currently appear only moderately prepared for the company's future challenges. Particularly relevant: professionalizing board work in municipal energy utilities enables higher perceived company success. So concludes the study by the Reutlingen Energy Center at Reutlingen University, commissioned by five companies in the sector.
This article adopts a qualitative comparative causal mapping approach to extend knowledge of the interrelated barriers to public entrepreneurship and the outcomes of such entrepreneurship. The results highlight marked differences between the sales segment and the distribution grid segment of German public enterprises that should prompt a refined perspective on public entrepreneurship. Notably, besides intra-organizational barriers and those interfering from the external environment, results also show that a public enterprise’s supervisory board can hinder its progress. This study thus contributes to recent discussion on governance and entrepreneurship by revealing a feature that could distinguish public from private enterprises.
Impregnated paper-based decorative laminates prepared from lignin-substituted phenolic resins
(2020)
High Pressure Laminate (HPL) panels consist of stacks of self-gluing paper sheets soaked with phenol-formaldehyde (PF) resins. An important requirement for such PFs is that they must rapidly penetrate and saturate the paper pores. Partially substituting phenol with bio-based phenolic chemicals like lignin changes the physico-chemical properties of the resin and affects its ability to penetrate the paper. In this study, PF formulations containing different proportions of lignosulfonate and kraft lignin were used to prepare paper-based laminates. The penetration of a Kraft paper sheet was characterized by a recently introduced device measuring the conductivity between both sides of the paper sheet after a drop of resin was placed on the surface and allowed to penetrate the sheet. The main target value measured was the time required for a specific resin to completely penetrate the defined paper sample ("penetration time"). This penetration time generally depends on the molecular weight distribution, the flow behavior and the polarity of the resin, which in turn are dependent on the manufacturing conditions of the resin. In the present study, the influences of three process factors: (1) type of lignin material used for substitution, (2) lignin modification by phenolation and (3) degree of phenol substitution on the penetration times of various lignin-phenolic hybrid impregnation resins were studied using a complete two-level three-factorial experimental design. Thin laminates made with the resins diluted in methanol were mechanically tested in terms of tensile and flexural strains, and their cross-sections were studied by light microscopy.
Here, the effects of substituting portions of fossil-based phenol in phenol formaldehyde resin by renewable lignin from two different sources are investigated using a factorial screening experimental design. Among the resins consumed by the wood-based industry, phenolics are one of the most important types used for impregnation, coating or gluing purposes. They are prepared by condensing phenol with formaldehyde (PF). One major use of PF is as matrix polymer for decorative laminates in exterior cladding and wet-room applications. Important requirements for such PFs are favorable flow properties (low viscosity), rapid curing behavior (high reactivity) and sufficient self-adhesion capacity (high residual curing potential). Partially substituting phenol in PF with bio-based phenolic co-reagents like lignin modifies the physicochemical properties of the resulting resin. In this study, phenol-formaldehyde formulations were synthesized where either 30% or 50% (in weight) of the phenol monomer were substituted by either sodium lignosulfonate or Kraft lignin. The effect of modifying the lignin material by phenolation before incorporation into the resin synthesis was also investigated. The resins so obtained were characterized by Fourier Transform Infra-Red (FTIR) spectroscopy, Size Exclusion Chromatography (SEC), Differential Scanning Calorimetry (DSC), rheology, and measurements of contact angle and surface tension using the Wilhelmy plate method and drop shape analysis.
Modern working environments increasingly rely on workplace-related digital technologies. While this offers numerous opportunities, it can also have negative consequences for employees' health. The current coronavirus crisis further exacerbates these challenges for many companies. Stress that arises directly or indirectly from the use of technologies is referred to as "technostress". Important levers for avoiding it include the design of the technologies themselves and the consideration of various individual and situational factors in technological change processes.
Hardly any software development process is used as prescribed by authors or standards. Regardless of company size or industry sector, a majority of project teams and companies use hybrid development methods (short: hybrid methods) that combine different development methods and practices. Even though such hybrid methods are highly individualized, a common understanding of how to systematically construct synergetic practices is missing. In this article, we make a first step towards a statistical construction procedure for hybrid methods. Grounded in 1467 data points from a large‐scale practitioner survey, we study the question: What are hybrid methods made of and how can they be systematically constructed? Our findings show that only eight methods and few practices build the core of modern software development. Using an 85% agreement level in the participants' selections, we provide examples illustrating how hybrid methods can be characterized by the practices they are made of. Furthermore, using this characterization, we develop an initial construction procedure, which allows for defining a method frame and enriching it incrementally to devise a hybrid method using ranked sets of practices.
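The agreement-level idea above can be sketched as a simple frequency filter: a practice belongs to the core of a hybrid method if at least the given share of respondents selected it. Respondents and practice names below are invented; the survey's actual data model is not described in the abstract:

```python
# Keep only those practices selected by at least `agreement` of respondents.
def core_practices(selections, agreement=0.85):
    """selections: list of sets of practices chosen by each respondent."""
    n = len(selections)
    counts = {}
    for chosen in selections:
        for p in chosen:
            counts[p] = counts.get(p, 0) + 1
    return {p for p, c in counts.items() if c / n >= agreement}

responses = [{"code review", "ci", "standup"},
             {"code review", "ci"},
             {"code review", "standup"},
             {"code review", "ci", "retro"}]
print(core_practices(responses))  # only 'code review' reaches the 85% level
```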
A methodology for designing planar spiral antennas with a feeding network embedded within a dielectric is presented. To avoid a purely academic design that could not be manufactured with available standard technologies, the approach takes manufacturing process requirements into account through the choice of materials used in the simulation. General design rules are provided. They encompass, amongst others, selection criteria for the dielectric material, aspects to consider when sketching the radiating element design, as well as those for the implementation of the feeding network. A rule of thumb, which may be helpful in determining the height of the antenna's supporting substrate, has been found. The appeal of the method resides in the fact that it eases the design process and helps to minimize errors, saving time and money. The approach also enables the design of a compact and small-size spiral antenna as antenna-in-package (AiP), and provides the opportunity to assemble the antenna with other RF components/systems on the same layer stack or on the same integration platform.
Purpose: Despite growing interest in the intersection of supply chain management (SCM) and management accounting (MA) in the academic debate, there is a lack of understanding regarding both the content and the delimitation of this topic. As of today, no common conceptualization of supply chain management accounting (SCMA) exists. The purpose of this study is to provide an overview of the research foci of SCMA in the scholarly debate of the past two decades. Additionally, it analyzes whether and to what extent the academic discourse of MA in SCs has already found its way into both SCM and MA higher education, respectively.
Design/methodology/approach: A content analysis is conducted including 114 higher education textbooks written in English or in German language.
Findings: The study finds that SC-specific concepts of MA are seldom covered in current textbooks of both disciplines. The authors conclude that although there is an extensive body of scholarly research about SCMA concepts, there is a significant discrepancy with what is taught in higher education textbooks.
Practical implications: There is a large discrepancy between the extensive knowledge available in scholarly research and what we teach in both disciplines. This implies that graduates of both disciplines lack important knowledge and skills in controlling and accounting for SCs. To bring about the necessary change, MA and SCM in higher education must be more integrative.
Originality/value: To the best of the authors' knowledge, this study is the first of its kind comprising a large textbook sample in both English and German. It is the first substantiated assessment of the current state of integration between SCM and MA in higher education.
Problem: More and more companies are introducing lean principles but find that traditional cost accounting does not adequately cover their requirements for suitable cost information.
Objective: A cost accounting approach oriented toward lean thinking incorporates new cost allocation objects and provides cost information that has so far been neglected.
Method: Common cost accounting approaches are contrasted with a coherent "accounting for lean" approach; commonalities and overlaps are identified.
Errors, manipulation and rationality – how reporting influences decision-makers' behavior
(2020)
The purpose of management reporting is to satisfy the information needs of executives. However, both preparers and users of reports act only boundedly rationally. Reports therefore do not work with pinpoint accuracy but trigger a variety of undesired reactions among those involved. This article explains how the "human factor" affects the preparation and use of management reports, and how effective and efficient management reporting can minimize undesired effects.
Businesses need to cope with myriad challenges including increasingly competitive markets and rapid developments in digital technology. The overall aim of the research described in this paper is to generate fresh insights into the impacts of digitalisation on the design and management of global supply chains. It focuses on understanding the current adoption rate of new technologies in global supply chains, identifying perceived opportunities and challenges and clarifying the critical factors driving (and inhibiting) their deployment. The authors administered an online survey with a global sample of respondents from various supply chain functions, resulting in a sample of 142 responses. Significant differences emerged in adoption patterns between companies of different sizes. Moreover, the study pointed to a widening gap (or a ‘digital divide’) between leaders and laggards in terms of technology adoption. Perceived benefits and challenges also differ notably between companies of varying sizes. Adoption patterns are very diverse across specific technologies. The results further suggest that there is a significant correlation between adoption of digital technologies and different dimensions of company performance.
With the continuous development of the economy, consumers pay more attention to the demand for personalized clothing. However, the recommendation quality of existing clothing recommendation systems is not sufficient to meet users' needs. When browsing online clothing, facial expression is salient information for understanding the user's preference. In this paper, we propose a novel method to automatically personalize clothing recommendation based on user emotional analysis. Firstly, the facial expression is classified by a multiclass SVM. Next, the user's multi-interest value is calculated using the expression intensity obtained by a hybrid RCNN. Finally, the multi-interest values are fused to carry out personalized recommendation. The experimental results show that the proposed method achieves a significant improvement over other algorithms.
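The fusion step described above can be sketched as a weighted scoring of viewed items: each classified expression contributes an interest value proportional to its intensity, and items are ranked by accumulated score. The expression weights, item names and observations below are invented for illustration and are not values from the paper:

```python
# Toy interest-value fusion: classified expression * intensity per viewing
# event, accumulated per item, then ranked for recommendation.
EXPRESSION_WEIGHT = {"happy": 1.0, "surprise": 0.6, "neutral": 0.2,
                     "sad": -0.5, "angry": -1.0}   # assumed weights

def interest_value(expression, intensity):
    """Map a classified expression and its intensity to an interest score."""
    return EXPRESSION_WEIGHT[expression] * intensity

def recommend(viewing_log, top_n=2):
    """viewing_log: list of (item_id, expression, intensity) observations."""
    scores = {}
    for item, expr, inten in viewing_log:
        scores[item] = scores.get(item, 0.0) + interest_value(expr, inten)
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

log = [("coat", "happy", 0.9), ("coat", "neutral", 0.5),
       ("scarf", "sad", 0.7), ("dress", "surprise", 0.8)]
print(recommend(log))  # ['coat', 'dress']
```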
Driven by digital transformation, manufacturing systems are heading towards autonomy. The implementation of autonomous elements in manufacturing systems is still a big challenge. Especially small and medium sized enterprises (SME) often lack experience to assess the degree of Autonomous Production. Therefore, a description model for the assessment of stages for Autonomous Production has been identified as a core element to support such a transformation process. In contrast to existing models, the developed SME-tailored model comprises different levels within a manufacturing system, from single manufacturing cells to the factory level. Furthermore, the model has been validated in several case studies.
The chemical synthesis of polysiloxanes from monomeric starting materials involves a series of hydrolysis, condensation and modification reactions with complex monomeric and oligomeric reaction mixtures. Real-time monitoring and precise process control of the synthesis process is of great importance to ensure reproducible intermediates and products and can readily be performed by optical spectroscopy. In chemical reactions involving rapid and simultaneous functional group transformations and complex reaction mixtures, however, the spectroscopic signals are often ambiguous due to overlapping bands, shifting peaks and changing baselines. The univariate analysis of individual absorbance signals is hence often only of limited use. In contrast, batch modelling based on the multivariate analysis of the time course of principal components (PCs) derived from the reaction spectra provides a more efficient tool for real time monitoring. In batch modelling, not only single absorbance bands are used but information over a broad range of wavelengths is extracted from the evolving spectral fingerprints and used for analysis. Thereby, process control can be based on numerous chemical and morphological changes taking place during synthesis. “Bad” (or abnormal) batches can quickly be distinguished from “normal” ones by comparing the respective reaction trajectories in real time. In this work, FTIR spectroscopy was combined with multivariate data analysis for the in-line process characterization and batch modelling of polysiloxane formation. The synthesis was conducted under different starting conditions using various reactant concentrations. The complex spectral information was evaluated using chemometrics (principal component analysis, PCA). Specific spectral features at different stages of the reaction were assigned to the corresponding reaction steps. Reaction trajectories were derived based on batch modelling using a wide range of wavelengths. 
Subsequently, complexity was reduced again to the most relevant absorbance signals in order to derive a concept for a low-cost process spectroscopic set-up which could be used for real-time process monitoring and reaction control.
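The batch-modelling idea above reduces each reaction-time spectrum to a score on a principal component, so the sequence of scores traces the reaction trajectory. A minimal PCA via power iteration on made-up three-wavelength "spectra" illustrates this (the real analysis uses full FTIR spectra and commercial chemometrics software):

```python
# Minimal PCA (leading component only) via power iteration, used to project
# a time series of spectra onto a one-dimensional reaction trajectory.
def first_pc(data, iters=200):
    """Return (loading vector, scores) for the first principal component."""
    n, m = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(m)]
    x = [[row[j] - means[j] for j in range(m)] for row in data]
    # covariance matrix (m x m) of the mean-centred spectra
    cov = [[sum(x[i][a] * x[i][b] for i in range(n)) / n for b in range(m)]
           for a in range(m)]
    v = [1.0] * m
    for _ in range(iters):                 # power iteration
        w = [sum(cov[a][b] * v[b] for b in range(m)) for a in range(m)]
        norm = sum(c * c for c in w) ** 0.5
        v = [c / norm for c in w]
    scores = [sum(x[i][j] * v[j] for j in range(m)) for i in range(n)]
    return v, scores

# invented spectra drifting steadily along one direction as cure proceeds
spectra = [[1.0, 0.0, 2.0], [1.2, 0.1, 1.8], [1.4, 0.2, 1.6], [1.6, 0.3, 1.4]]
_, trajectory = first_pc(spectra)
print(trajectory)  # monotone scores along the reaction coordinate
```

An abnormal batch would show up as a trajectory deviating from the band traced by normal batches, which is exactly the real-time comparison the abstract describes.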
This work is a comparative study of survey tools intended to help developers select a suitable tool for application in an AAL environment. The first step was to identify the basic functionality required of survey tools used for AAL technologies and to compare these tools by their functionality and assignments. The comparison was derived from the data obtained, previous literature studies and further technical data. A list of requirements was stated and ordered in terms of relevance to the target application domain. With the help of an integrated assessment method, a generalized estimate value was calculated and the result is explained. Finally, the planned application of this tool in a running project is described.
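The abstract does not define the integrated assessment method; one common form consistent with a "generalized estimate value" over weighted requirements is a weighted average per tool. The sketch below assumes that form; the tools, requirements, weights and scores are all invented:

```python
# Weighted-sum assessment: requirement weights encode relevance, each tool
# gets a score per requirement, and the generalized estimate value is the
# weighted average used for ranking.
def estimate_value(scores, weights):
    """Weighted average of per-requirement scores."""
    total_w = sum(weights.values())
    return sum(scores[r] * w for r, w in weights.items()) / total_w

weights = {"offline use": 3, "data export": 2, "accessibility": 5}
tools = {
    "ToolA": {"offline use": 4, "data export": 5, "accessibility": 3},
    "ToolB": {"offline use": 2, "data export": 4, "accessibility": 5},
}
ranked = sorted(tools, key=lambda t: estimate_value(tools[t], weights),
                reverse=True)
print(ranked[0])  # ToolB wins because accessibility carries the most weight
```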
Participation in fast fashion brands’ clothes recycling plans in an omnichannel retail environment
(2020)
The rise of the fast fashion industry allows more and more people to participate in fashion consumption, but goes along with negative consequences for the environment. To reduce wastage, fast fashion retailers have begun to offer used clothes recycling plans to which customers can submit clothes they no longer wear. Since these recycling plans have mainly been operated in offline stores so far, the rise of omnichannel retailing poses new challenges for retailers with regard to organizing the plans and motivating consumers to participate. Using a sample of N=370 Chinese fast fashion consumers, this paper investigates which factors determine consumers’ willingness to participate in fast fashion brands’ used clothes recycling plans in an omnichannel retailing environment. It finds that consumers’ clothes recycling intention is determined by individual predispositions (environmental attitude, impulsive consumption), by organizational arrangements (channel integration quality), and by the outcomes of their interaction (consumer satisfaction, brand identification). Conclusions are drawn, implications for omnichannel fast fashion retailing practice as well as for further research are derived, and limitations are discussed.
Melamine-formaldehyde (MF) resins are widely used for decorative paper impregnation. Resin properties relevant for impregnation are largely determined already at the stage of resin synthesis by the applied reaction conditions. Thus, understanding the relationship between reaction conditions and technological properties is important. Response surface methodology based on orthogonal parameter level variations is the most suitable tool to identify and quantify factor effects and deduce causal correlation patterns. Here, two major process factors of MF resin synthesis were systematically varied using such a statistical experimental design. To arrive at resins having a broad range of technological properties, initial pH and M:F ratio were varied in a wide range (pH: 7.9–12.1; M:F ratio: 1:1.5–1:4.5). The impregnation behavior of the resins was modeled using viscosity, penetration rate and residual curing capacity as technological responses. Based on the response surface models, nonlinear and synergistic action of the process factors was quantified and a suitable process window for preparing resins with favorable impregnation performance was defined. It was found that low M:F ratios (~1:2–1:2.5) and comparatively high starting pHs (~pH 11) yield impregnation resins with rapid impregnation behavior and good residual curing capacity.
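The response-surface modelling used above fits a second-order polynomial in the two process factors to each measured response. A minimal least-squares sketch of such a fit is shown below; the function name, the use of NumPy, and the factor/response roles (e.g. pH and M:F ratio vs. viscosity) are illustrative assumptions, not the authors' actual software:

```python
import numpy as np

def fit_quadratic_surface(x1, x2, y):
    """Least-squares fit of a second-order response surface:
    y ~ b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2.

    x1, x2: factor settings (e.g. initial pH, M:F ratio);
    y: measured response (e.g. viscosity).
    Returns the coefficient vector [b0, b1, b2, b12, b11, b22].
    """
    x1, x2, y = map(np.asarray, (x1, x2, y))
    X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs
```

The interaction term (b12) and the squared terms (b11, b22) are what capture the synergistic and nonlinear factor effects the abstract refers to.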
It has been widely shown that biomaterial surface topography can modulate host immune response, but a fundamental understanding of how different topographies contribute to pro-inflammatory or anti-inflammatory responses is still lacking. To investigate the impact of surface topography on immune response, we undertook a systematic approach by analyzing immune response to eight grades of medical-grade polyurethane of increasing surface roughness in three in vitro models of the human immune system. Polyurethane specimens were produced with defined roughness values by injection molding according to the VDI 3400 industrial standard. Specimens ranged from 0.1 μm to 18 μm in average roughness (Ra), which was confirmed by confocal scanning microscopy. Immunological responses were assessed with THP-1-derived macrophages, human peripheral blood mononuclear cells (PBMCs), and whole blood following culture on polyurethane specimens. As shown by the release of pro-inflammatory and anti-inflammatory cytokines in all three models, a mild immune response to polyurethane was observed; however, this was not associated with the degree of surface roughness. Likewise, the cell morphology (cell spreading, circularity, and elongation) in THP-1-derived macrophages and the expression of CD molecules in the PBMC model on T cells (HLA-DR and CD16), NK cells (HLA-DR), and monocytes (HLA-DR, CD16, CD86, and CD163) showed no influence of surface roughness. In summary, this study shows that modifying surface roughness in the micrometer range on polyurethane has no impact on the pro-inflammatory immune response. Therefore, we propose that such modifications do not affect the immunocompatibility of polyurethane, thereby supporting the notion of polyurethane as a biocompatible material.
5-hydroxymethyl-furfural (HMF) and furfural are interesting as potential platform chemicals for a bio-based chemical production economy. Within the scope of this work, the process routes under technical development for the production of these platform chemicals were investigated. For two selected processes, the material and energy flows, as well as the carbon footprint, were examined in detail. Possible optimizations of the production processes, further development potential, and the remaining research demand with regard to reducing primary energy expenditure were identified.
The dynamic capabilities perspective is aimed at explaining how firms achieve and sustain competitive advantages, especially in environments that become volatile, uncertain, complex, and ambiguous (VUCA). In this paper, we combine factors that explain dynamic capabilities on the firm level with factors of dynamic managerial capabilities on the individual level. In addition to the dynamic capabilities theory, we draw on corporate foresight (CF) literature to test the impact of CF training. We find that both the organizational-level practices and the individual-level training of leaders are positively associated with firm-level outcomes. We further observe that this relationship is mediated by dynamic managerial capabilities (i.e., the ability of leaders to challenge current business models, make decisions under uncertainty, and reconfigure organizational resources). Our findings emphasize the importance of training leaders and building organizational CF practices to build the dynamic capabilities needed in VUCA environments.
The key aim of Open Strategy is to open up the process of strategy development to larger groups within and even outside an organization. Furthermore, Open Strategy aims to include broad groups of stakeholders in the various steps of the strategy process. The question at hand is: how can Open Strategy be achieved? What approaches can be used? Scenario planning and business wargaming are approaches perceived as relevant tools in the field of strategy and strategic foresight, and in the context of Open Strategy, because of their participative nature. The aim of this article is to assess to what degree scenario planning and business wargaming can be used in the context of Open Strategy. While these approaches are suitable, their current application limits the number of potential participants. Further research and experimentation in practice with larger groups and/or online approaches, or a combination of both, are needed to explore the potential of scenario planning and business wargaming as tools for Open Strategy.
Heat pumps are a vital element for reaching the greenhouse gas (GHG) reduction targets in the heating sector, but their system integration requires smart control approaches. In this paper, we first offer a comprehensive literature review and definition of the term control for the described context. Additionally, we present a control approach, which consists of an optimal scheduling module coupled with a detailed energy system simulation module. The aim of this integrated two-part control approach is to improve the performance of an energy system equipped with a heat pump, while recognizing the technical boundaries of the energy system in full detail. By applying this control to a typical family household situation, we illustrate that this integrated approach results in a more realistic heat pump operation and thus a more realistic assessment of the control performance, while still achieving lower operational costs.
The approach of self-organized and autonomously controlled systems offers great potential to meet new requirements for the economical production of customized products in small batch sizes, based on a distributed, flexible management of dynamics and complexity within the production and intralogistics system. To support the practical application of self-organization in intralogistics systems, a catalogue of criteria for evaluating the self-organization of flexible logistics systems has been developed and validated. It enables the classification of logistics systems as well as the identification and evaluation of the potentials that can be achieved by increasing the degree of self-organization.
Pharmaceutical companies are among the top investors into research and development (R&D) globally, as product innovation is still the main growth driver for the industry and because the related complexities necessitate enormous R&D investments. The market demand for new medicines to be more efficacious or to provide better safety than existing drugs and the regulatory need to prove superiority in clinical trials are reasons why drug R&D is increasingly expensive and pharmaceutical companies need to manage extraordinarily high costs per approved new compound.
We investigated the state of artificial intelligence (AI) in pharmaceutical research and development (R&D) and outline here a risk and reward perspective regarding digital R&D. Given the novelty of the research area, a combined qualitative and quantitative research method was chosen, including the analysis of annual company reports, investor relations information, patent applications, and scientific publications of 21 pharmaceutical companies for the years 2014 to 2019. As a result, we can confirm that the industry is in an ‘early mature’ phase of using AI in R&D. Furthermore, we can demonstrate that, despite the efforts that need to be managed, recent developments in the industry indicate that it is worthwhile to invest to become a ‘digital pharma player’.
Airports have largely outgrown their sole purpose of being travel hubs; by connecting millions of passengers to their destinations each year on an international scale, they have become increasingly interesting for business and related marketing opportunities. In fact, passengers are easily segmented and can be reached effectively throughout specific airport areas, making some areas more suitable for advertising than others. Emotional states, roaming time, and the freedom to move vastly influence how much information passengers are able to absorb from their direct surroundings. Our research shows that some areas are more suitable than others; therefore, a careful selection of airport locations for communication will be key to securing the impact and improving the effectiveness of communication measures. With these insights, advertisers can deliberately choose the areas that are most effective for displaying their ads.
Deutschland, quo vadis?
(2020)
Shutdown in Germany in March 2020. Standstill in retail and industry. The market value of a considerable number of companies halved within a very short time. Investors dumped everything onto the market, and amid the high uncertainty all asset classes lost value, at times even gold. Even corporations such as Lufthansa will not be able to survive without state aid.
Classic sales tasks are changing rapidly and profoundly. Sales managers urgently need new strategic approaches for how they will shape customer contacts, manage distribution channels, and sell more effectively in the future. A recent study provides insight into how companies can prepare for this structural change.
Value engineering in customer communication is a structured method for improving communication processes between companies. The concept takes up proven elements of technical value analysis and overhead value analysis and transfers them to customer communication. The approach offers a systematic procedure for examining and redesigning communication processes between supplier and customer. Value engineering in customer communication thus creates competitive advantages by optimizing communication.
Innovative capacity is one of the key success factors of the future and will strongly influence the difference between successful and failing companies (PWC, 2015). Young companies and start-ups in particular are known for their high innovative capacity. Established companies, by contrast, score less with new ideas but instead with innovation-critical resources, routines, and economies of scale. "Intrapreneurship" is an approach of steadily growing popularity for combining the capabilities and resources of established companies with the innovative power of start-ups.
Improvement of a three-layered in vitro skin model for topical application of irritating substances
(2020)
In the field of skin tissue engineering, the development of physiologically relevant in vitro skin models comprising all skin layers, namely epidermis, dermis, and subcutis, is a great challenge. Increasing regulatory requirements and the ban on animal experiments for substance testing demand the development of reliable and in vivo-like test systems, which enable high-throughput screening of substances. However, the reproducibility and applicability of in vitro testing have so far been insufficient due to fibroblast-mediated contraction. To overcome this pitfall, an advanced 3-layered skin model was developed. While the epidermis of standard skin models showed an 80% contraction, the initial epidermal area of our advanced skin models was maintained. The improved barrier function of the advanced models was quantified by an indirect barrier function test and a permeability assay. Histochemical and immunofluorescence staining of the advanced model showed well-defined epidermal layers, a dermal part with distributed human dermal fibroblasts and a subcutis with round-shaped adipocytes. The successful response of these advanced 3-layered models for skin irritation testing demonstrated their suitability as an in vitro model for these clinical tests: only the advanced model classified irritative and non-irritative substances correctly. These results indicate that the advanced set-up of the 3-layered in vitro skin model maintains skin barrier function and therefore makes it more suitable for irritation testing.
In previous studies, we used a method for detecting stress that was based exclusively on heart rate and ECG to differentiate between situations such as mental stress, physical activity, relaxation, and rest. As a response of the heart to these situations, we observed different behavior in the root mean square of successive differences of heartbeat intervals (RMSSD). This study aims to analyze virtual reality, delivered via a virtual reality headset, as an effective stressor for future work. The RMSSD value is an important marker for the parasympathetic effect on the heart and can provide information about stress. For these measurements, the RR intervals were collected using a chest belt; additional sensors were not used for the analysis. We conducted experiments with ten subjects who had to drive a simulator for 25 minutes using monitors and 25 minutes using a virtual reality headset. Before starting and after finishing each simulation, the subjects completed a survey describing their mental state. The experimental results show that driving with a virtual reality headset has some influence on heart rate and RMSSD, but it does not significantly increase the stress of driving.
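The RMSSD marker used in the study above is computed directly from successive RR intervals. The following is a generic sketch of the standard definition, not the authors' exact implementation:

```python
import math

def rmssd(rr_intervals_ms):
    """Root mean square of successive differences (RMSSD) of RR intervals.

    rr_intervals_ms: sequence of RR intervals in milliseconds.
    Returns the RMSSD in milliseconds: sqrt(mean of squared
    differences between consecutive RR intervals).
    """
    if len(rr_intervals_ms) < 2:
        raise ValueError("need at least two RR intervals")
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))
```

Higher RMSSD values reflect stronger beat-to-beat variability and hence stronger parasympathetic influence on the heart.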
Cardiovascular diseases are directly or indirectly responsible for up to 38.5% of all deaths in Germany and thus represent the most frequent cause of death. At present, heart diseases are mainly discovered by chance during routine visits to the doctor or when acute symptoms occur. However, there is no practical method to proactively detect diseases or abnormalities of the heart in the daily environment and to take preventive measures for the person concerned. Long-term ECG devices, as currently used by physicians, are simply too expensive, impractical, and not widely available for everyday use. This work aims to develop an ECG device suitable for everyday use that can be worn directly on the body. For this purpose, an already existing hardware platform will be analyzed, and the corresponding potential for improvement will be identified. A precise picture of the existing data quality is obtained by metrological examination, and corresponding requirements are defined. Based on these identified optimization potentials, a new ECG device is developed. The revised ECG device is characterized by a high integration density and combines all components directly on one board except the battery and the ECG electrodes. The compact design allows the device to be attached directly to the chest. An integrated microcontroller allows digital signal processing without the need for an additional computer. Central features of the evaluation are a peak detection for detecting R-peaks and a calculation of the current heart rate based on the RR interval. To ensure the validity of the detected R-peaks, a model of the anatomical conditions is used. Thus, unrealistic RR-intervals can be excluded. The wireless interface allows continuous transmission of the calculated heart rate. Following the development of hardware and software, the results are verified, and appropriate conclusions about the data quality are drawn. 
As a result, a very compact and wearable ECG device with different wireless technologies, data storage, and evaluation of RR intervals was developed. Some tests yielded runtimes of up to 24 hours with wireless LAN activated and streaming.
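The heart-rate calculation with plausibility filtering described in the abstract above can be sketched as follows. The rejection bounds are illustrative assumptions standing in for the anatomical model the authors mention, not their actual parameters:

```python
def heart_rate_bpm(rr_ms, min_rr=300.0, max_rr=2000.0):
    """Instantaneous heart rate from a single RR interval in milliseconds.

    RR intervals outside [min_rr, max_rr] ms are treated as detection
    artefacts (e.g. missed or spurious R-peaks) and rejected by
    returning None; the bounds are illustrative assumptions.
    """
    if not (min_rr <= rr_ms <= max_rr):
        return None
    return 60000.0 / rr_ms  # 60,000 ms per minute / interval length
```

For example, an RR interval of 1000 ms corresponds to 60 bpm, while a 100 ms interval would be rejected as anatomically implausible.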
Checklists are a valuable tool to ensure process quality and quality of care. To ensure proper integration in clinical processes, it would be desirable to generate checklists directly from formal process descriptions. Those checklists could also be used for user interaction in context-aware surgical assist systems. We built a tool to automatically convert Business Process Model and Notation (BPMN) process models to checklists displayed as HTML websites. Gateways representing decisions are mapped to checklist items that trigger dynamic content loading based on the placed checkmark. The usability of the resulting system was positively evaluated regarding comprehensibility and end-user friendliness.
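A minimal sketch of such a BPMN-to-checklist conversion might look as follows. The dictionary-based process representation and the HTML layout are illustrative assumptions, not the tool's actual input format or output; in the real system, gateway options would trigger dynamic loading of the matching checklist branch:

```python
def checklist_html(process):
    """Render a simplified BPMN-like process as an HTML checklist.

    `process` is a list of nodes of the form
    {"type": "task", "name": ...} or
    {"type": "gateway", "name": ..., "options": [...]}.
    Tasks become checkbox items; gateways become decision items.
    """
    items = []
    for node in process:
        if node["type"] == "task":
            items.append(f'<li><input type="checkbox"> {node["name"]}</li>')
        elif node["type"] == "gateway":
            # Decisions are rendered as radio options; in the real system,
            # selecting one would load the corresponding branch content.
            opts = "".join(
                f'<label><input type="radio" name="{node["name"]}"> {o}</label>'
                for o in node["options"]
            )
            items.append(f'<li>{node["name"]}: {opts}</li>')
    return "<ul>\n" + "\n".join(items) + "\n</ul>"
```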
Drug-induced liver toxicity is one of the most common reasons for the failure of drugs in clinical trials and frequent withdrawal from the market. Reasons for such failures include the low predictive power of in vivo studies, which is mainly caused by metabolic differences between humans and animals, and intraspecific variances. In addition to factors such as age and genetic background, changes in drug metabolism can also be caused by disease-related changes in the liver. Such metabolic changes have also been observed in clinical settings, for example, in association with a change in liver stiffness, a major characteristic of an altered fibrotic liver. To mimic these changes in an in vitro model, this study aimed to develop scaffolds that represent the rigidity of healthy and fibrotic liver tissue. We observed that liver cells plated on scaffolds representing the stiffness of healthy livers showed a higher metabolic activity compared to cells plated on stiffer scaffolds. Additionally, we detected a positive effect of a scaffold pre-coated with fetal calf serum (FCS)-containing media. This pre-incubation resulted in increased cell adherence during cell seeding onto the scaffolds. In summary, we developed a scaffold-based 3D model that mimics liver stiffness-dependent changes in drug metabolism and that may more easily predict drug interactions in diseased livers.