Refine
Year of publication
- 2020 (312)
Document Type
- Journal article (144)
- Conference proceeding (93)
- Book chapter (41)
- Book (10)
- Report (9)
- Doctoral Thesis (6)
- Working Paper (4)
- Anthology (3)
- Issue of a journal (2)
Is part of the Bibliography
- yes (312)
Institute
- ESB Business School (105)
- Informatik (101)
- Life Sciences (43)
- Technik (37)
- Texoversum (25)
Publisher
- Springer (47)
- Elsevier (30)
- Hochschule Reutlingen (20)
- IEEE (15)
- MDPI (13)
- Springer Gabler (8)
- ACM (7)
- De Gruyter (6)
- Wiley (6)
- AMD Akademie Mode & Design (5)
A fast way to test business ideas and to explore customer problems and needs is to talk to customers directly. Customer interviews help to understand which solutions customers will pay for before valuable resources are invested in developing them. They are a good way to gain qualitative insights. However, conducting interviews is a demanding procedure that requires specific skills, and current ways of teaching interview skills have significant deficiencies: in particular, they lack guidance and opportunities to practice. Objective: The goal of this work is to develop and validate a workshop format that teaches the skills needed to conduct good customer interviews in a practical manner. Method: The research method is based on design science research, which serves as a framework. A game-based workshop format was designed to teach interview skills. The approach consists of a half-day, hands-on workshop and is based on an analysis of the necessary interview skills. It has been validated in several workshops and improved based on the learnings from those workshops. Results: The validation shows that participants could significantly improve their interview skills while enjoying the game-based exercises. The game-based learning approach supports learning and practicing customer interview skills with playful and interactive elements that increase participants' motivation to conduct interviews.
Automatic classification of rotating machinery defects using Machine Learning (ML) algorithms
(2020)
Electric machines and motors have been the subject of enormous development. New concepts in design and control allow their applications to expand into different fields. Vast amounts of data have been collected in almost every domain of interest; such data can be static, i.e., represent real-world processes at a fixed point in time. Vibration analysis and vibration monitoring, including the detection and monitoring of anomalies in vibration data, are widely used techniques for predictive maintenance in high-speed rotating machines. However, accurately identifying the presence of a bearing fault can be challenging in practice, especially when the failure is still at an incipient stage and the signal-to-noise ratio of the monitored signal is small. The main objective of this work is to design a system that analyzes the vibration signals of a rotating machine in the time/frequency domain, based on data recorded from sensors. Given the substantial interest in this task, there has been a dramatic increase in applications of Machine Learning (ML) algorithms to it. An ML system is used to classify and detect abnormal behavior and to recognize the different machine operation modes. The proposed solution can be deployed as predictive maintenance for Industry 4.0.
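As a rough illustration of such a pipeline, the following sketch extracts spectral band energies from synthetic vibration signals and classifies them with a nearest-centroid rule. The sampling rate, fault frequency, and signal model are invented for the example and are not taken from the paper.

```python
import numpy as np

FS = 2_000           # sampling rate in Hz (assumed for the sketch)
FAULT_HZ = 160.0     # hypothetical bearing-defect frequency

def simulate(fault: bool, n: int = 2048, rng=None) -> np.ndarray:
    """Synthetic vibration signal: shaft tone plus optional fault tone and noise."""
    if rng is None:
        rng = np.random.default_rng(0)
    t = np.arange(n) / FS
    x = np.sin(2 * np.pi * 30 * t)                    # shaft rotation component
    if fault:
        x += 0.8 * np.sin(2 * np.pi * FAULT_HZ * t)   # defect component
    return x + 0.3 * rng.standard_normal(n)

def band_energies(x: np.ndarray, n_bands: int = 8) -> np.ndarray:
    """Split the magnitude spectrum into equal bands; return normalized energies."""
    mag = np.abs(np.fft.rfft(x))
    bands = np.array([b.sum() for b in np.array_split(mag, n_bands)])
    return bands / bands.sum()

# Class centroids learned from a handful of labeled signals.
rng = np.random.default_rng(42)
centroids = {label: np.mean([band_energies(simulate(label, rng=rng))
                             for _ in range(20)], axis=0)
             for label in (False, True)}

def classify(x: np.ndarray) -> bool:
    """Nearest-centroid decision: True means 'fault detected'."""
    f = band_energies(x)
    return min(centroids, key=lambda label: np.linalg.norm(f - centroids[label]))
```

Any spectral classifier would do here; the point is that a fault tone shifts energy into a characteristic frequency band, which a simple feature-space distance already separates.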
Our paper gives first answers to a fundamental question: how can the design of architectures of intelligent digital systems and services be accomplished methodologically? Intelligent systems and services are the goals of many current digitalization efforts and part of massive digital transformation efforts based on digital technologies. Digital systems and services are the foundation of digital platforms and ecosystems. Digitalization disrupts existing businesses, technologies, and economies and promotes the architecture of open environments. This has a strong impact on new value-added opportunities and the development of intelligent digital systems and services. Digital technologies such as artificial intelligence, the Internet of Things, services computing, cloud computing, big data with analytics, mobile systems, and social enterprise network systems are important enablers of digitalization. The current publication presents our research on the architecture of intelligent digital ecosystems, products, and services influenced by the service-dominant logic. We present original methodological extensions and a new reference model for digital architectures with an integral service and value perspective to model intelligent systems and services that effectively align digital strategies and architectures with artificial intelligence as main elements to support intelligent digitalization.
Artificial Intelligence-based Assistants (AIAs) are spreading quickly both in homes and offices. They have already left their original habitat of "intelligent speakers" providing easy access to music collections. They have initiated a multitude of new devices and are already populating devices such as TV sets. Characteristic of intelligent digital assistants is the formation of platforms around their core functionality. Thus, the AI capabilities of the assistants are used to offer new services and create new interfaces for business processes. There are positive network effects between the assistants and the services as well as among the services. Therefore, many companies see the need to get involved in the field of digital assistants but lack a framework to align their initiatives with their corporate strategies. In order to lay the foundation for a comprehensive method, we therefore investigate intelligent digital assistants. Based on this analysis, we develop a framework of strategic opportunities and challenges.
Intelligent systems and services are the strategic targets of many current digitalization efforts and part of massive digital transformations based on digital technologies with artificial intelligence. Digital platform architectures and ecosystems provide an essential base for intelligent digital systems. The paper raises an important question: Which development paths are induced by current innovations in the field of artificial intelligence and digitalization for enterprise architectures? Digitalization disrupts existing enterprises, technologies, and economies and promotes the architecture of cognitive and open intelligent environments. This has a strong impact on new opportunities for value creation and the development of intelligent digital systems and services. Digital technologies such as artificial intelligence, the Internet of Things, service computing, cloud computing, blockchains, big data with analysis, mobile systems, and social business network systems are essential drivers of digitalization. We investigate the development of intelligent digital systems supported by a suitable digital enterprise architecture. We present methodological advances and an evolutionary path for architectures with an integral service and value perspective to enable intelligent systems and services that effectively combine digital strategies and digital architectures with artificial intelligence.
Today, many companies are adapting their strategy, business models, products, services as well as business processes and information systems in order to expand their digitalization level through intelligent systems and services. The paper raises an important question: What are cognitive co-creation mechanisms for extending digital services and architectures to readjust the usage value of smart services? Typically, extensions of digital services and products and their architectures are manual design tasks that are complex and require specialized, rare experts. The current publication explores the basic idea of extending specific digital artifacts, such as intelligent service architectures, through mechanisms of cognitive co-creation to enable a rapid evolutionary path and better integration of humans and intelligent systems. We explore the development of intelligent service architectures through a combined, iterative, and permanent task of co-creation between humans and intelligent systems as part of a new concept of cognitively adapted smart services. In this paper, we present components of a new platform for the joint co-creation of cognitive services for an ecosystem of intelligent services that enables the adaptation of digital services and architectures.
Background
A pressing task of electrocardiographic examination is to increase the reliability of diagnosing the condition of the heart. Within this task, an important direction is the solution of the inverse problem of electrocardiography, based on processing the electrocardiographic signals of multichannel cardio leads with known electrode coordinates in these leads (Titomir et al., Noninvasive Electrocardiotopography, 2003; Macfarlane et al., Comprehensive Electrocardiology, 2nd ed., Chapter 9, 2011).
Results
In order to obtain more detailed information about the electrical activity of the heart, we reconstruct the distribution of equivalent electrical sources on the heart surface. We perform this reconstruction of the equivalent sources throughout the cardiac cycle at relatively low hardware cost. ECG maps of electrical potentials on the surface of the torso (TSPM) and of electrical sources on the surface of the heart (HSSM) were studied for different times of the cardiac cycle. We carried out a visual and quantitative comparison of these maps in the presence of pathological regions of different localization. For this purpose we used a model of the heart's electrical activity based on cellular automata.
Conclusions
The cellular automata model allows us to consider the processes of heart excitation in the presence of pathological regions of various sizes and localizations. It is shown that changes in the distribution of electrical sources on the surface of the epicardium in the presence of pathological areas with disturbed conduction of heart excitation are much more noticeable than changes in ECG maps on the torso surface.
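The excitation dynamics described above can be illustrated with a generic excitable-medium cellular automaton (Greenberg-Hastings-style rules, not the authors' actual heart model). Grid size, stimulus site, and the blocked "pathological" region below are invented for the sketch.

```python
import numpy as np

RESTING, EXCITED, REFRACTORY, BLOCKED = 0, 1, 2, -1

def step(grid: np.ndarray) -> np.ndarray:
    """One Greenberg-Hastings update on a periodic grid: resting cells fire
    if any 4-neighbor is excited; excited cells turn refractory; refractory
    cells recover. BLOCKED cells never change state."""
    excited = grid == EXCITED
    neigh = (np.roll(excited, 1, 0) | np.roll(excited, -1, 0) |
             np.roll(excited, 1, 1) | np.roll(excited, -1, 1))
    nxt = grid.copy()
    nxt[(grid == RESTING) & neigh] = EXCITED
    nxt[excited] = REFRACTORY
    nxt[grid == REFRACTORY] = RESTING
    return nxt

N = 40
grid = np.zeros((N, N), dtype=int)
grid[15:25, 15:25] = BLOCKED       # non-conducting 'pathological' region
grid[20, 0] = EXCITED              # point stimulus at the edge

activated = np.zeros((N, N), dtype=bool)   # cells that fired at least once
for _ in range(60):
    grid = step(grid)
    activated |= grid == EXCITED
```

The wave travels around the blocked region, so tissue "behind" it still activates while the pathological cells never fire — the qualitative conduction disturbance the abstract describes.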
Intraoperative brain deformation, so-called brain shift, affects the applicability of preoperative magnetic resonance imaging (MRI) data to assist intraoperative ultrasound (iUS) guidance during neurosurgery. This paper proposes a deep learning-based approach for fast and accurate deformable registration of preoperative MRI to iUS images to correct brain shift. Based on a 3D convolutional neural network architecture, the proposed deep MRI-iUS registration method has been successfully tested and evaluated on the retrospective evaluation of cerebral tumors (RESECT) dataset. This study showed that our proposed method outperforms registration methods from previous studies, with an average mean squared error (MSE) of 85. Moreover, the method can register three 3D MRI-iUS pairs in less than a second, improving the expected outcomes of brain surgery.
Purpose: Gliomas are the most common and aggressive type of brain tumors due to their infiltrative nature and rapid progression. The process of distinguishing tumor boundaries from healthy cells is still a challenging task in the clinical routine. Fluid attenuated inversion recovery (FLAIR) MRI modality can provide the physician with information about tumor infiltration. Therefore, this paper proposes a new generic deep learning architecture, namely DeepSeg, for fully automated detection and segmentation of the brain lesion using FLAIR MRI data.
Methods: The developed DeepSeg is a modular decoupling framework. It consists of two connected core parts based on an encoding and decoding relationship. The encoder part is a convolutional neural network (CNN) responsible for spatial information extraction. The resulting semantic map is inserted into the decoder part to get the full-resolution probability map. Based on modified U-Net architecture, different CNN models such as residual neural network (ResNet), dense convolutional network (DenseNet), and NASNet have been utilized in this study.
Results: The proposed deep learning architectures have been successfully tested and evaluated online on the MRI datasets of the brain tumor segmentation (BraTS 2019) challenge, comprising 336 training cases and 125 validation cases. The Dice and Hausdorff distance scores of the obtained segmentation results are about 0.81 to 0.84 and 9.8 to 19.7, respectively.
Conclusion: This study showed successful feasibility and comparative performance of applying different deep learning models in a new DeepSeg framework for automated brain tumor segmentation in FLAIR MR images. The proposed DeepSeg is open source and freely available at https://github.com/razeineldin/DeepSeg/.
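The Dice score reported above is a standard overlap metric for segmentation; a minimal sketch of its computation on toy binary masks (not BraTS data) could look like this.

```python
import numpy as np

def dice(pred: np.ndarray, truth: np.ndarray) -> float:
    """Dice = 2|A∩B| / (|A| + |B|) for binary masks; 1.0 is perfect overlap."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    denom = pred.sum() + truth.sum()
    if denom == 0:
        return 1.0                  # convention: two empty masks match perfectly
    return 2.0 * np.logical_and(pred, truth).sum() / denom

# Toy 2D example: a 4x4 'tumor' and a prediction shifted by one row.
truth = np.zeros((8, 8), dtype=bool); truth[2:6, 2:6] = True
pred = np.zeros((8, 8), dtype=bool);  pred[3:7, 2:6] = True
```

On real 3D volumes the same formula applies voxel-wise per tumor subregion, which is how challenge leaderboards such as BraTS report it.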
Here, we report the mechanical and water sorption properties of a green composite based on Typha latifolia fibres. The composite was prepared either completely binderless or bonded with 10% (w/w) of a bio-based resin, a mixture of an epoxidized linseed oil and a tall-oil-based polyamide. The flexural modulus of elasticity, the flexural strength and the water absorption of hot-pressed Typha panels were measured, and the influence of pressing time and panel density on these properties was investigated. The cure kinetics of the bio-based resin were analyzed by differential scanning calorimetry (DSC) in combination with the iso-conversional kinetic analysis method of Vyazovkin to derive the curing conditions required for a completely cured resin. For the binderless Typha panels, the best technological properties were achieved at high density. By adding 10% of the binder resin, the flexural strength and especially the water absorption were improved significantly.
In recent years, machine learning algorithms have made huge advances in performance and applicability in industry and especially in maintenance. Their application enables predictive maintenance and thus offers efficiency gains. However, successful implementation of such solutions still requires high effort in data preparation to obtain the right information, interdisciplinary teams, and good communication with employees. Here, small and medium-sized enterprises (SMEs) often lack experience, competence and capacity. This paper presents a systematic and practice-oriented method, which has already been validated, for implementing machine learning solutions for predictive maintenance in SMEs.
Here, we study resin cure and network formation of solid melamine formaldehyde pre-polymer over a large temperature range via dynamic temperature curing profiles. Real-time infrared spectroscopy is used to analyze the chemical changes during network formation and network hardening. By applying chemometrics (multivariate curve resolution, MCR), the essential chemical functionalities that constitute the network at a given stage of curing are mathematically extracted and tracked over time. The three spectral components identified by MCR were methylol-rich, ether-linkage-rich and methylene-linkage-rich resin entities. Based on the dynamic changes of their characteristic spectral patterns as a function of temperature, curing is divided into five phases: (I) stationary phase with free methylols as the main chemical feature, (II) formation of a flexible network cross-linked by ether linkages, (III) formation of a rigid, ether-cross-linked network, (IV) further hardening via transformation of methylols and ethers into methylene cross-linkages, and (V) network consolidation via transformation of ether into methylene bridges. The presented spectroscopic/chemometric approach can be used as a methodological basis for the functionality design of MF-based surface films at the stage of laminate pressing, i.e., for tailoring the technological property profile of cured MF films using a causal understanding of the underlying chemistry based on molecular markers and spectroscopic fingerprints.
With the exercises and case studies in this workbook, the central chapters and topics of the textbook can be systematically reviewed, deepened, and applied to problems from corporate practice. The detailed solution hints for all exercises are made directly comprehensible through numerous figures and tables. The workbook thus enables, above all, optimal exam preparation.
Our paper investigates the response of acquiring firms’ stock returns around the announcement date in cross-border mergers and acquisitions (M&A) between listed Chinese acquirers and German targets. We apply an event study methodology to examine the shareholder value effect based on a sample of M&A deals over the most recent period of 2012-2018. We apply a market model event study based on the argumentation of Brown and Warner (1985) and use short-term observation periods according to Andrade, Mitchell, and Stafford (2001) as well as Hackbarth and Morellec (2008). The results indicate that the announcement of M&A involving German targets results in a positive cumulative abnormal return of on average 2.18% for Chinese acquirers’ shareholders in a five-day symmetric event window. Furthermore, we found slight indications of possible information leakage prior to the formal announcement. Although it shows that the size of acquiring firms is not necessarily correlated with the positive abnormal returns in the short run, this study suggests that Chinese acquirers’ shareholders gain higher abnormal returns when the German targets are non-listed companies.
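A market-model event study of this kind can be sketched as follows; the estimation-window length and the synthetic return series are illustrative assumptions, not the paper's data.

```python
import numpy as np

def car(stock: np.ndarray, market: np.ndarray, event_idx: int,
        est_len: int = 120, half_window: int = 2) -> float:
    """Cumulative abnormal return over a symmetric event window under the
    market model, with alpha/beta estimated on a pre-event window."""
    est = slice(event_idx - half_window - est_len, event_idx - half_window)
    beta, alpha = np.polyfit(market[est], stock[est], 1)   # OLS market model
    win = slice(event_idx - half_window, event_idx + half_window + 1)
    abnormal = stock[win] - (alpha + beta * market[win])
    return float(abnormal.sum())

# Synthetic demo: a stock that follows the market model exactly, plus a
# 1% abnormal return on each of the five event-window days.
rng = np.random.default_rng(0)
mkt = rng.normal(0.0, 0.01, 250)
stk = 0.0005 + 1.2 * mkt
stk[148:153] += 0.01
```

The five-day symmetric window mirrors the short observation periods the abstract cites; with noise-free model returns, the recovered CAR equals the injected 5% abnormal return.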
This study investigates how integrated reporting (IR) creates value for investors. It examines how providers of financial capital benefit from an improved firm information environment provided by IR. Specifically, this study investigates the effect of voluntary IR disclosure on analyst earnings forecast accuracy as well as on firm value. To do so, we use an international sample of 167 listed companies that voluntarily publish an integrated report. Our analysis shows no significant effect of a voluntary IR publication on analyst earnings forecast accuracy and no significant effect on firm value. We thus do not find evidence for the fulfillment of IR's promises regarding improved information environment and value creation of voluntary adopters. We conclude that such companies might already have a relatively high level of transparency leading to an absent additional effect of IR disclosure. Positive effects of IR appear to be more relevant in environments where IR is mandatory.
This article analyzes the reform of the IFRS and US GAAP standards on lease accounting. Using McKesson Europe AG as an example, it illustrates the effects of the first-time application of the standards on the lessee. Of particular interest is a comparison of the accounting models under the "old" standards IAS 17 and ASC 840 and the "new" standards IFRS 16 and ASC 842. The results show that IFRS and US GAAP do not fully converge. Differences arise above all in the presentation in the income statement, and these also affect key earnings figures.
This book presents an empirical investigation of the efforts that multinational pharmaceutical companies take in order to find a business model that allows for a profitable access to the Bottom of the Pyramid (BoP) markets. The Bottom of the Pyramid in Africa is frequently mentioned as an attractive market due to its sheer size. Yet most companies struggle to access it because of the low price level, difficult physical market access and challenges when it comes to payment.
More specifically, the book investigates the following business model-related questions: Do pharmaceutical companies provide products that meet the needs of the BoP? What characterizes the value generation of the company? What revenue model leads to a profitable business, and what role does a network of partners play in the business model?
Findings reveal that there is no 'one-size-fits-all' answer to these questions. Providing continuous availability, affordability and good quality of goods and services, creating health awareness, as well as localizing business to achieve a level of inclusiveness are essential prerequisites for success. In the last chapter the book provides a business model prototype that accounts for these key success factors for business at the Bottom of the Pyramid and points to further research topics.
Massive data transfers in modern data-intensive systems, resulting from low data locality and data-to-code system designs, hurt their performance and scalability. Near-data processing (NDP) and a shift to code-to-data designs may represent a viable solution, now that packaging combinations of storage and compute elements on the same device has become feasible.
The shift towards NDP system architectures calls for a revision of established principles. Abstractions such as data formats and layouts typically span multiple layers in traditional DBMS, and the way they are processed is encapsulated within these layers of abstraction. NDP-style processing requires an explicit definition of cross-layer data formats and accessors to ensure in-situ execution that optimally utilizes the properties of the underlying NDP storage and compute elements. In this paper, we make the case for such data format definitions and investigate the performance benefits under NoFTL-KV and the COSMOS hardware platform.
nKV in action: accelerating KV-stores on native computational storage with near-data processing
(2020)
Massive data transfers in modern data-intensive systems, resulting from low data locality and data-to-code system designs, hurt their performance and scalability. Near-data processing (NDP) designs represent a feasible solution which, although not new, has yet to see widespread use.
In this paper we demonstrate various NDP alternatives in nKV, which is a key/value store utilizing native computational storage and near-data processing. We showcase the execution of classical operations (GET, SCAN) and complex graph-processing algorithms (Betweenness Centrality) in-situ, with 1.4x-2.7x better performance due to NDP. nKV runs on real hardware - the COSMOS+ platform.
Massive data transfers in modern key/value stores, resulting from low data locality and data-to-code system designs, hurt their performance and scalability. Near-data processing (NDP) designs represent a feasible solution which, although not new, has yet to see widespread use.
In this paper we introduce nKV, a key/value store utilizing native computational storage and near-data processing. On the one hand, nKV can directly control data and computation placement on the underlying storage hardware. On the other hand, nKV propagates the data formats and layouts to the storage device, where software and hardware parsers and accessors are implemented. Both allow NDP operations to execute in a host-intervention-free manner, directly on physical addresses, and thus to better utilize the underlying hardware. Our performance evaluation is based on executing traditional KV operations (GET, SCAN) and complex graph-processing algorithms (Betweenness Centrality) in-situ, with 1.4×-2.7× better performance on real hardware – the COSMOS+ platform.
The field of breath analysis has attracted growing interest in medical diagnosis and patient monitoring. Its main advantages are that it is noninvasive, painless and repeatable in flexible cycles. Even though breath analysis has been researched for a couple of decades, many questions remain unanswered. Human breath contains volatile organic compounds which are emitted from inside the body. Some of these compounds can be assigned to specific sources, such as inflammation or cancer, but also to non-health-related origins. This paper gives an overview of breath analysis for the purpose of disease diagnosis and health monitoring. To this end, literature on breath analysis in the medical field has been analyzed, from its early stages to the present. As a result, this paper gives an outline of the topic of breath analysis.
A considerable share of car accidents is attributable to driver fatigue. To prevent fatigue-related accidents, several approaches already exist, such as detecting changes in driving behavior. Within the IoT lab of the Human Centered Computing master's program at Reutlingen University, various driver assistance systems are to be developed and tested to prevent accidents caused by fatigue. This work deals with fatigue detection via computer vision (CV) and the electrocardiogram (ECG). In this paper, CV-based fatigue detection at the wheel is realized using the open-source libraries OpenCV and Dlib and the embedded PC Nvidia Jetson Nano. Fatigue is detected from the ECG via the heartbeat and heart rate variability. This work also developed an interface combining CV and ECG that aggregates the detection-relevant data from the Python scripts for CV-based and ECG-based fatigue detection. These data are then evaluated into an overall result.
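Fatigue detection from eye closure with OpenCV/Dlib is commonly based on the eye aspect ratio (EAR) computed from the six eye landmarks of Dlib's 68-point model (Soukupová & Čech, 2016). The sketch below shows the EAR formula on made-up landmark coordinates; the threshold is an illustrative assumption that would be tuned per driver.

```python
import numpy as np

def eye_aspect_ratio(eye: np.ndarray) -> float:
    """EAR = (|p2-p6| + |p3-p5|) / (2|p1-p4|) over six eye landmarks
    (p1..p6 in landmark order); the ratio drops toward 0 as the eye closes."""
    v1 = np.linalg.norm(eye[1] - eye[5])   # first vertical eyelid distance
    v2 = np.linalg.norm(eye[2] - eye[4])   # second vertical eyelid distance
    h = np.linalg.norm(eye[0] - eye[3])    # horizontal eye width
    return (v1 + v2) / (2.0 * h)

EAR_THRESHOLD = 0.2   # illustrative cutoff, not a value from this work

# Made-up landmark coordinates for an open and a nearly closed eye.
open_eye = np.array([[0, 0], [2, 1], [4, 1], [6, 0], [4, -1], [2, -1]], float)
closed_eye = np.array([[0, 0], [2, 0.2], [4, 0.2], [6, 0], [4, -0.2], [2, -0.2]], float)
```

In a live system, the landmarks would come from Dlib's shape predictor per video frame, and sustained frames below the threshold would trigger the fatigue alarm.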
With significant advancements in digital technologies, firms find themselves competing in an increasingly dynamic business environment. It is of paramount importance that organizations undertake proper governance mechanisms with respect to their business and IT strategies. Therefore, IT governance (ITG) has become an important factor for firm performance. In recent years, agility has evolved as a core concept for governance, especially in the area of software development. However, the impact of agility on ITG and firm performance has not been analyzed by the broad scientific community. This paper focuses on the question, how the concept of agility affects the ITG–firm performance relationship. The conceptual model for this question was tested by a quantitative research process with 400 executives responding to a standardized survey. Findings show that the adoption of agile principles, values, and best practices to the context of ITG leads to meaningful results for governance, business/IT alignment, and firm performance.
IT governance: current state of and future perspectives on the concept of agility in IT governance
(2020)
Digital transformation has changed corporate reality and, with that, corporates’ IT environments and IT governance (ITG). As such, the perspective of ITG has shifted from the design of a relatively stable, closed and controllable system of a self-sufficient enterprise to a relatively fluid, open, agile and transformational system of networked co-adaptive entities. Related to the paradigm shift in ITG, this thesis aims to conceptualize a framework to integrate the concept of agility into the traditional ITG framework and to test the effects of such an extended ITG framework on corporate performance.
To do so, the thesis uses literature research and a mixed method design by blending both qualitative and quantitative research methods. Given the poorly understood situation of the agile mechanisms within the ITG framework, the building process of this thesis’ research model requires an adaptive and flexible approach which involves four different research phases. The initial a priori research model based on a comprehensive review of the extant literature is critically examined and refined at the end of each research phase, which later forms the basis of a subsequent research phase. As a result, the final research model provides guidance on how the conceptualized framework leads to better business/IT alignment as well as how business/IT alignment can mediate the effectiveness of such an extended ITG framework on corporate performance.
The first research phase explores the current state of literature with a focus on the ITG-corporate performance association. This analysis identifies five perspectives with respect to the relationship between ITG and corporate performance. The main variables lead to the perspectives of business/IT alignment, IT leadership, IT capability and process performance, resource relatedness and culture. Furthermore, the analysis presents core aspects explored within the identified perspectives that could act as potential mediators or moderators in the relationship between ITG and corporate performance.
The second research phase investigates the agile aspect of an effective ITG framework in the dynamic contemporary world through a qualitative study. Gleaned from 46 semi-structured interviews with governance experts across various industries, the study identifies 25 agile ITG mechanisms and 22 traditional ITG mechanisms that corporations use to master digital transformation projects. Moreover, the research reveals two key patterns pointing to a need for ambidextrous ITG, with corporations alternating between stability and agility in their ITG mechanisms.
In research phase three, a scale development process is conducted in order to operationalize the agile items explored in research phase two. Through 56 qualitative interviews with professionals, the evaluation uncovers 46 agile governance mechanisms. These mechanisms are then rated by 29 experts to identify the most effective ones, leading to the identification of six structural elements, eight processes, and eight relational mechanisms.
Finally, in research phase four a quantitative research approach through a survey of 400 respondents is established to test and predict the formulated relationships by using the partial least squares structural equation modelling (PLS-SEM) method. The results provide evidence for a strong causal relationship among an expanded ITG concept, business/IT alignment, and corporate performance. These findings reveal that the agile ITG mechanisms within an effective ITG framework seem critical in today’s digital age.
This research is unique in exploring the combination of traditional and agile ITG mechanisms. It contributes to the theoretical base by integrating and extending the literature on ITG, business/IT alignment, ambidexterity and agility, all of which have long been recognized as critical for achieving organizational goals. In summary, this work presents an original analysis of an effective ITG framework for digital transformation by including the agile aspect within the ITG construct. It highlights that it is not enough to apply only traditional mechanisms to achieve effective business/IT alignment in today's digital age; agile ITG mechanisms are also needed. Therefore, a novel ITG framework following an ambidextrous approach is provided, consisting of traditional ITG mechanisms as well as newly developed agile ITG practices. This thesis also demonstrates that agile ITG mechanisms can be measured independently of traditional ITG mechanisms within one causal model. This is an important theoretical outcome that allows the current state of ITG to be assessed in two distinct dimensions, offering various pathways for further research on the different antecedents and effects of traditional and agile ITG mechanisms. Furthermore, this thesis makes practical contributions by highlighting the need to develop a basic governance framework powered by traditional ITG mechanisms and simultaneously increase agility in ITG mechanisms. The results imply that corporations might be even more successful if they include both traditional and agile mechanisms in their ITG framework. In this way, the uncovered agile ITG practices may provide a template for CIOs to derive their own mechanisms, following an ambidextrous approach suitable for their corporation.
The data presented in this article characterize the thermomechanical and microhardness properties of a novel melamine-formaldehyde resin (MF) intended for the use as a self-healing surface coating. The investigated MF resin is able to undergo reversible crosslinking via Diels Alder reactive groups. The microhardness data were obtained from nanoindentation measurements performed on solid resin film samples at different stages of the self-healing cycle. Thermomechanical analysis was performed under dynamic load conditions. The data provide supplemental material to the manuscript published by Urdl et al. 2020 (https://doi.org/10.1016/j.eurpolymj.2020.109601) on the self-healing performance of this resin, where a more thorough discussion on the preparation, the properties of this coating material and its application in impregnated paper-based decorative laminates can be found.
The self-healing effect of melamine-based surfaces, triggered by temperature, was investigated. The temperature-triggered reversible healing chemistry, on which the self-healing effect is based, is the Diels-Alder (DA) reaction between furan and maleimide groups. Melamine building blocks containing furan groups were connected by a multi-functional maleimide crosslinker via a Diels-Alder (DA) reaction to give a DA adduct. The DA adduct was then reacted with formaldehyde to form a network by the conventional condensation reaction of melamine amino groups with formaldehyde. The obtained resin was characterised and used for the impregnation of paper. Impregnated papers and neat resin were used to perform scratch-healing tests and mechanical analysis of the novel coating system.
Thermoplastic polymers like ethylene-octene copolymer (EOC) may be grafted with silanes via reactive extrusion to enable subsequent crosslinking for advanced biomaterials manufacture. However, this reactive extrusion process is difficult to control and it remains challenging to reproducibly arrive at well-defined products. Moreover, high grafting degrees require a considerable excess of grafting reagent. A large proportion of the silane passes through the process without reacting and needs to be removed at great expense by subsequent purification. This results in unnecessarily high consumption of chemicals and a rather resource-inefficient process. It is thus desirable to achieve defined grafting degrees with optimum grafting efficiency by means of suitable process control. In this study, the continuous grafting of vinyltrimethoxysilane (VTMS) on ethylene-octene copolymer (EOC) via reactive extrusion was investigated. Successful grafting was verified and quantified by 1H-NMR spectroscopy. The effects of five process parameters and their synergistic interactions on grafting degree and grafting efficiency were determined using a face-centered experimental design (FCD). Response surface methodology (RSM) was applied to derive a causal process model and define process windows yielding arbitrary grafting degrees between <2 and >5% with minimal waste of grafting agent. It was found that the reactive extrusion process was strongly influenced by several second-order interaction effects, making this process difficult to control. Grafting efficiencies between 75 and 80% can be realized as long as grafting degrees <2% are accepted.
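The response surface step described above can be sketched in a few lines. The coded design points follow the usual face-centered layout for two factors, but the factor assignment and the response values below are invented for illustration; they are not the study's measurements.

```python
import numpy as np

def quadratic_design_matrix(X):
    """Design matrix of a second-order response surface model for two
    coded factors x1, x2: y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2."""
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones(len(X)), x1, x2, x1 * x2, x1**2, x2**2])

# Face-centered design for two coded factors: corner, face-center and center points
X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],
              [-1, 0], [1, 0], [0, -1], [0, 1], [0, 0]], dtype=float)

# Hypothetical grafting-degree responses (illustrative values only)
y = np.array([1.2, 3.5, 1.8, 5.1, 1.4, 4.2, 2.0, 3.1, 2.6])

# Least-squares fit of the quadratic model
beta, *_ = np.linalg.lstsq(quadratic_design_matrix(X), y, rcond=None)

# Predict the response at a new point inside the design space
y_hat = quadratic_design_matrix(np.array([[0.5, 0.5]])) @ beta
```

With the fitted coefficients, process windows can be read off by evaluating the model over a grid of coded factor settings.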
Due to decreased mobility or families living apart, older adults are especially vulnerable to social isolation. The literature suggests that technology can help to prevent this isolation. The present work addresses an approach to participating in society by sharing cherished knowledge. We propose PrecRec, a cooking recipe exchange application for older adults, to make them feel precious and valued. PrecRec has been developed and evaluated in an iterative process with eleven older adults. The results show that a broad perspective has to be taken into account when designing such systems.
It is essential for the success of a company to set a strategic direction in which a product offering will be developed over time to achieve the company vision. For this reason, roadmaps are used in practice. In general, roadmaps can be expressed in various forms such as technology roadmaps, product roadmaps or industry roadmaps. From the point of view of industry, the basic purpose of a roadmap is to explore, visualize and communicate the dynamic linkage between markets, products and technology.
Nowadays, companies are facing increasing market dynamics, rapidly evolving technologies and shifting user expectations. Together with the adoption of lean and agile practices, this situation makes it increasingly difficult to plan and predict upfront which products, services or features should be developed in the future. Consequently, many organizations are struggling to provide reliable and stable product roadmaps using traditional approaches. This paper aims to identify and better understand which measures companies have taken to adapt their current product roadmapping practices to the requirements of a dynamic and uncertain market environment. This also includes challenges and success factors within this transformation process as well as measures that companies have planned for the future. We conducted 18 semi-structured expert interviews with practitioners from different companies and performed a thematic data analysis. The study shows that the participating companies are aware that transforming traditional product roadmapping practices to fulfill the requirements of a dynamic and uncertain market environment is necessary. The most important measures the participating companies have taken are 1) adequate item planning concerning the timeline, 2) the replacement of a fixed time-based chart by a more flexible structure, 3) the use of outcomes to determine the items (such as features) on a roadmap, and 4) the creation of a central roadmap that allows deriving different representations for each stakeholder and department.
Context: A product roadmap is an important tool in product development. It sets the strategic direction in which the product is to be developed to achieve the company’s vision. However, for product roadmaps to be successful, it is essential that all stakeholders agree with the company’s vision and objectives and are aligned and committed to a common product plan.
Objective: In order to gain a better understanding of product roadmap alignment, this paper aims to identify measures, activities and techniques for aligning the different stakeholders around the product roadmap.
Method: We conducted a grey literature review following the guidelines of Garousi et al.
Results: Several approaches to gain alignment were identified such as defining and communicating clear objectives based on the product vision, conducting cross-functional workshops, shuttle diplomacy, and mission briefing. In addition, our review identified the “Behavioural Change Stairway Model” that suggests five steps to gain alignment by building empathy and a trustful relationship.
Demands on energy utilities will grow: in the future, tasks such as the development of digitalized products/services as well as ecological activities will gain particular relevance. This is shown by Hochschule Reutlingen in its current survey of supervisory board members, managing directors and executives. Despite the expected changes, the supervisory boards are aware of the pressure towards greater professionalization, yet currently appear only moderately prepared for the future challenges facing their companies. Particularly relevant: professionalizing board work in municipal utilities enables higher perceived corporate success. These are the findings of the study by the Reutlinger Energiezentrum at Hochschule Reutlingen, commissioned by five companies in the sector.
Demands on energy utilities will grow. In the future, tasks such as the development of digitalized products and services as well as ecological activities will gain importance. This is shown by the present study among supervisory board members, managing directors and executives in the energy industry. Despite the expected changes, the supervisory boards are aware of the pressure towards greater professionalism but currently appear only moderately prepared for the future challenges facing their companies. Particularly relevant: professionalizing board work in municipal utilities enables higher perceived corporate success. From a systems perspective, this also requires an effective system of corporate objectives and high leadership effectiveness on the part of the management, for example through clear communication of the corporate strategy. Furthermore, possible differences in how the management's entrepreneurial leadership style is perceived need to be identified and resolved, since this can be an important catalyst for entrepreneurial thinking and acting among executives and employees.
This article adopts a qualitative comparative causal mapping approach to extend knowledge of the interrelated barriers to public entrepreneurship and the outcomes of such entrepreneurship. The results highlight marked differences between the sales segment and the distribution grid segment of German public enterprises that should prompt a refined perspective on public entrepreneurship. Notably, besides intra-organizational barriers and those interfering from the external environment, results also show that a public enterprise’s supervisory board can hinder its progress. This study thus contributes to recent discussion on governance and entrepreneurship by revealing a feature that could distinguish public from private enterprises.
The success of resale in the fashion industry is illustrated above all by its strong growth: compared to retail, it grew 24 times faster last year. Resale, a currently emerging form of selling, refers to the process products go through when they are sold a second time, i.e. resold second-hand. Retail, in contrast, describes the traditional sale of products through (brick-and-mortar) stores. Thus, more and more products that have already been owned by someone else are returning to the market and are offered for sale again. How this upward trend in the fashion industry can be measured, and to what extent resale meets retail, is described in the following.
Impregnated paper-based decorative laminates prepared from lignin-substituted phenolic resins
(2020)
High Pressure Laminate (HPL) panels consist of stacks of self-gluing paper sheets soaked with phenol-formaldehyde (PF) resins. An important requirement for such PFs is that they must rapidly penetrate and saturate the paper pores. Partially substituting phenol with bio-based phenolic chemicals like lignin changes the physico-chemical properties of the resin and affects its ability to penetrate the paper. In this study, PF formulations containing different proportions of lignosulfonate and kraft lignin were used to prepare paper-based laminates. The penetration of a Kraft paper sheet was characterized by a recently introduced device measuring the conductivity between both sides of the paper sheet after a drop of resin was placed on the surface and allowed to penetrate the sheet. The main target value measured was the time required for a specific resin to completely penetrate the defined paper sample (“penetration time”). This penetration time generally depends on the molecular weight distribution, the flow behavior and the polarity of the resin, which in turn depend on the manufacturing conditions of the resin. In the present study, the influences of three process factors: (1) type of lignin material used for substitution, (2) lignin modification by phenolation and (3) degree of phenol substitution on the penetration times of various lignin-phenolic hybrid impregnation resins were studied using a complete two-level three-factorial experimental design. Thin laminates made with the resins diluted in methanol were mechanically tested in terms of tensile and flexural strains, and their cross-sections were studied by light microscopy.
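A complete two-level three-factorial design as used here simply enumerates all 2³ combinations of the factor levels. A minimal sketch (the level labels are shorthand, not the paper's exact coding):

```python
from itertools import product

# The three process factors from the study, each at two levels
# (level labels are illustrative shorthand)
factors = {
    "lignin_type": ["lignosulfonate", "kraft"],
    "phenolation": ["no", "yes"],
    "substitution": ["30%", "50%"],
}

# Complete two-level three-factor design: all 2**3 = 8 experimental runs
runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]

for i, run in enumerate(runs, 1):
    print(i, run)
```

Each of the eight runs then yields one penetration-time measurement, from which main and interaction effects can be estimated.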
Here, the effects of substituting portions of fossil-based phenol in phenol-formaldehyde resin with renewable lignin from two different sources are investigated using a factorial screening experimental design. Among the resins consumed by the wood-based industry, phenolics are one of the most important types used for impregnation, coating or gluing purposes. They are prepared by condensing phenol with formaldehyde (PF). One major use of PF is as matrix polymer for decorative laminates in exterior cladding and wet-room applications. Important requirements for such PFs are favorable flow properties (low viscosity), rapid curing behavior (high reactivity) and sufficient self-adhesion capacity (high residual curing potential). Partially substituting phenol in PF with bio-based phenolic co-reagents like lignin modifies the physicochemical properties of the resulting resin. In this study, phenol-formaldehyde formulations were synthesized in which either 30% or 50% (by weight) of the phenol monomer was substituted by either sodium lignosulfonate or Kraft lignin. The effect of modifying the lignin material by phenolation before incorporation into the resin synthesis was also investigated. The resins thus obtained were characterized by Fourier transform infrared (FTIR) spectroscopy, size exclusion chromatography (SEC), differential scanning calorimetry (DSC), rheology, and measurements of contact angle and surface tension using the Wilhelmy plate method and drop shape analysis.
In digitally transformed working environments, employees organize their working time, their place of work and the way they complete tasks to a greater extent themselves. Companies that want to increase the degree of self-organization in the course of the transformation process face a complex challenge. Self-organization affects numerous elements of the organization, such as work tasks and roles, leadership, rules and competencies. Based on an empirically developed frame of reference, the Digitalisierungsatlas (digitalization atlas), the various elements can be considered in an integrative manner and the interactions between the dimensions can be examined. If self-organization is viewed in terms of employees' autonomy to shape their own work tasks and their role within the organization, the interactions between the organizational dimensions and leadership are particularly relevant. The tensions between these dimensions are examined in detail. Overall, the article shows that self-organization cannot be understood as an independent phenomenon but always interacts with other dimensions.
In modern working environments, workplace-related digital technologies are increasingly used. While this offers numerous opportunities, it can also have negative consequences for employees' health. For many companies, these challenges are further exacerbated by the current corona crisis. Stress that arises directly or indirectly from the use of technologies is referred to as "technostress". Important levers for avoiding it include the design of technologies as well as the consideration of various individual and situational factors within technological change processes.
Facial expressions play a dominant role in facilitating social interactions. We endeavor to develop tactile displays to reinstate facial-expression-modulated communication. The high spatial and temporal dimensionality of facial movements poses a unique challenge when designing tactile encodings of them. A further challenge is developing encodings that are attuned to the perceptual characteristics of our skin. A caveat of using vibrotactile displays is that tactile stimuli have been shown to induce perceptual tactile aftereffects when used on the fingers, arm and face. However, at present, despite the prevalence of waist-worn tactile displays, no such investigations of tactile aftereffects at the waist region exist in the literature, though they are warranted by the unique sensory and perceptual signalling characteristics of this area. Using an adaptation paradigm, we investigated the presence of perceptual tactile aftereffects induced by continuous and burst vibrotactile stimuli delivered at the navel, side and spinal regions of the waist. We report evidence that the tactile perception topology of the waist is non-uniform, and specifically that the navel and spine regions are resistant to adaptive aftereffects while side regions are more prone to perceptual adaptations to continuous but not burst stimulations. The results of our current investigations highlight the unique set of challenges posed by designing waist-worn tactile displays. These and future perceptual studies can directly inform more realistic and effective implementations of complex high-dimensional spatiotemporal social cues.
Within the research project, finishing agents and processes were developed that serve the preventive protection of textiles (especially floor coverings) against soiling. The process involves a combined finishing of textiles with fluorinated polymers containing incorporated nanoparticles (primarily SiO2) to increase surface roughness. Commercially available hydrophobizing agents (fluorocarbon- or hydrocarbon-based polymers) were applied to carpets in combination with SiO2 nanoparticles and examined with respect to soiling, e.g. by coffee, KoolAid, red wine, AATCC standard soil or black shoe polish. For this purpose, the shear-sensitive dispersions of the hydrophobizing agents were mixed with newly developed, adapted dispersions of SiO2 nanoparticles. The SiO2 nanoparticles were synthesized with systematically varied sizes of 10-1,000 nm, comprehensively characterized and stabilized with the help of newly developed fluoromethacrylate copolymers with reactive groups (maleic, itaconic or citraconic anhydride) and hydrophilic modifiers (alcohol or amine groups). The resulting polymer-particle dispersions could be applied to textiles (PA, PES or WO carpets and fabrics) from aqueous or ethanolic-aqueous solutions. Furthermore, the newly developed fluorocarbon polymers were also tested with regard to their application. In soiling tests, the carpets finished in this way showed less soiling by standard soil than reference materials. The durability of the finish under mechanical stress could be improved by crosslinking the polymers on the textile material.
For PA 6 and PA 6.6 carpets, the best results with respect to reduced soiling by water-soluble stains (coffee, red wine, KoolAid) compared to untreated carpets were obtained when the finish was applied with fluoropolymer-stabilized SiO2 nanoparticles or with a combined dispersion of SiO2 particles and fluorocarbon resins. Less pronounced soiling by AATCC standard soil (DIN EN ISO 11378-2) compared to untreated carpets was found for PA 6 carpets treated with SiO2 particles. Hydrophobic stains (e.g. black shoe polish) were best removed from carpets finished with fluorocarbon polymers. The combination of SiO2 particles with fluorocarbon polymers usually proved more favorable than treatment with fluorocarbon resins alone. A correlation between nanoparticle size, abrasion resistance and cleaning properties was established, and it could be shown that FC-nanoparticle composites improve them. The mechanical durability of the anti-soiling finish with SiO2 nanoparticles and fluorocarbon polymers on polyamide carpets was tested, e.g. by hexapod drum testing (according to ISO 10361). SEM, IR spectroscopy and the water droplet test still confirmed an intact coating after 4,000 and even after 12,000 revolutions. Crosslinkers that crosslink the polymer itself, the polymer with particles and/or with the substrate surface improved abrasion resistance in some cases (more suitable crosslinkers may need to be identified here).
Are textile structures better? In the professional world, there is no doubt that textile composites can offer many advantages. It is well known that they are often better than non-textile alternatives. Examples are manifold. Innovative developments include not only the widely noted textile-reinforced concrete, which was awarded the Deutscher Zukunftspreis (German Future Award), but also a large number of perhaps less noticed or less spectacular products based on fiber-reinforced plastics.
Regardless of company size or industry sector, a majority of project teams and companies use customized processes that combine different development methods, so-called hybrid development methods. Even though such hybrid development methods are highly individualized, a common understanding of how to systematically construct synergetic practices is missing. Based on 1,467 data points from a large-scale online survey among practitioners, we study the current state of practice in process use to answer the question: What are hybrid development methods made of? Our findings reveal that only eight methods and few practices build the core of modern software development. This small set allows for statistically constructing hybrid development methods.
Hardly any software development process is used as prescribed by authors or standards. Regardless of company size or industry sector, a majority of project teams and companies use hybrid development methods (short: hybrid methods) that combine different development methods and practices. Even though such hybrid methods are highly individualized, a common understanding of how to systematically construct synergetic practices is missing. In this article, we make a first step towards a statistical construction procedure for hybrid methods. Grounded in 1,467 data points from a large-scale practitioner survey, we study the question: What are hybrid methods made of and how can they be systematically constructed? Our findings show that only eight methods and few practices build the core of modern software development. Using an 85% agreement level in the participants' selections, we provide examples illustrating how hybrid methods can be characterized by the practices they are made of. Furthermore, using this characterization, we develop an initial construction procedure, which allows for defining a method frame and enriching it incrementally to devise a hybrid method using ranked sets of practices.
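The core idea of building a method frame from high-agreement practices can be sketched as follows; the practice names, respondent data and the plain frequency rule are illustrative assumptions, not the article's actual procedure:

```python
from collections import Counter

def core_practices(selections, agreement=0.85):
    """Return the practices selected by at least `agreement` share of
    respondents; a simplified stand-in for deriving a method frame
    from high-agreement practices in survey data."""
    counts = Counter(p for sel in selections for p in set(sel))
    n = len(selections)
    return sorted(p for p, c in counts.items() if c / n >= agreement)

# Hypothetical practice selections from five survey respondents
answers = [
    {"Code Review", "Daily Standup", "Backlog"},
    {"Code Review", "Daily Standup", "Pair Programming"},
    {"Code Review", "Daily Standup", "Backlog"},
    {"Code Review", "Daily Standup"},
    {"Code Review", "Retrospective", "Daily Standup"},
]
print(core_practices(answers))  # both returned practices appear in 5/5 selections
```

Lowering the agreement threshold would incrementally enrich the frame with further, lower-ranked practices, which mirrors the incremental construction idea described above.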
A methodology for designing planar spiral antennas with a feeding network embedded within a dielectric is presented. To avoid a purely academic result that could not be manufactured with available standard technologies, the approach takes manufacturing process requirements into account through the choice of materials used in the simulation. General design rules are provided. They encompass, among others, selection criteria for the dielectric material, aspects to consider when sketching the radiating element design, as well as those for the implementation of the feeding network. A rule of thumb, which may be helpful in determining the height of the antenna's supporting substrate, has been found. The appeal of the method resides in the fact that it eases the design process and helps to minimize errors, saving time and money. The approach also enables the design of a compact, small-size spiral antenna as antenna-in-package (AiP) and provides the opportunity to assemble the antenna with other RF components/systems on the same layer stack or on the same integration platform.
Companies compete more and more as integrated supply chains rather than as individual firms. The success of the entire supply chain determines the economic well-being of the individual company. With management attention shifting to supply chains, the role of management accounting naturally must extend to the cross-company layer as well. This book demonstrates how management accounting can make a significant contribution to supply chain success. It targets students who are already familiar with the fundamentals of accounting and now want to extend their expertise in the field of cross-company (or network) management accounting. Practitioners will draw valuable insights from the text as well.
Purpose: Despite growing interest in the intersection of supply chain management (SCM) and management accounting (MA) in the academic debate, there is a lack of understanding regarding both the content and the delimitation of this topic. As of today, no common conceptualization of supply chain management accounting (SCMA) exists. The purpose of this study is to provide an overview of the research foci of SCMA in the scholarly debate of the past two decades. Additionally, it analyzes whether and to what extent the academic discourse of MA in SCs has already found its way into both SCM and MA higher education, respectively.
Design/methodology/approach: A content analysis is conducted including 114 higher education textbooks written in English or in German language.
Findings: The study finds that SC-specific concepts of MA are seldom covered in current textbooks of both disciplines. The authors conclude that although there is an extensive body of scholarly research about SCMA concepts, there is a significant discrepancy with what is taught in higher education textbooks.
Practical implications: There is a large discrepancy between the extensive knowledge available in scholarly research and what we teach in both disciplines. This implies that graduates of both disciplines lack important knowledge and skills in controlling and accounting for SCs. To bring about the necessary change, MA and SCM in higher education must be more integrative.
Originality/value: To the best of the authors' knowledge, this study is the first of its kind comprising a large textbook sample in both English and German. It is the first substantiated assessment of the current state of integration between SCM and MA in higher education.
Fehler, Manipulation und Rationalität – wie das Reporting das Verhalten der Entscheider beeinflusst
(2020)
The purpose of management reporting is to satisfy the information needs of executives. However, both preparers and users of reports act only boundedly rationally. Reports therefore do not have a precisely targeted effect but trigger a variety of undesired reactions among those involved. This article describes how the "human factor" affects the preparation and use of management reports and how effective and efficient management reporting can minimize undesired effects.
Lean management has found its way into many companies. Lean concepts place new demands on the type and structure of the required cost information, which traditional cost accounting systems do not directly fulfill. Proponents of "lean accounting" therefore propose in part radical changes and a simplification of cost accounting. This article discusses the limitations of traditional cost accounting in implementing lean management and presents selected "accounting for lean" approaches. The analysis shows that lean accounting approaches are too narrowly focused and cannot adequately reflect the plurality of cost accounting functions found in practice. A radical redesign of existing cost accounting systems is therefore rejected as unrealistic and unfounded. The article develops alternative proposals for how lean management concepts and the cost information they require can be integrated into traditional cost accounting systems.
Problem: More and more companies are introducing lean principles but find that their requirements for suitable cost information are not sufficiently covered by traditional cost accounting.
Objective: A cost accounting approach oriented towards lean thinking incorporates new cost allocation objects and provides previously neglected cost information.
Method: Common cost accounting approaches are contrasted with an integrated "accounting for lean" approach; commonalities and overlaps are identified.
Businesses need to cope with myriad challenges including increasingly competitive markets and rapid developments in digital technology. The overall aim of the research described in this paper is to generate fresh insights into the impacts of digitalisation on the design and management of global supply chains. It focuses on understanding the current adoption rate of new technologies in global supply chains, identifying perceived opportunities and challenges and clarifying the critical factors driving (and inhibiting) their deployment. The authors administered an online survey with a global sample of respondents from various supply chain functions, resulting in a sample of 142 responses. Significant differences emerged in adoption patterns between companies of different sizes. Moreover, the study pointed to a widening gap (or a ‘digital divide’) between leaders and laggards in terms of technology adoption. Perceived benefits and challenges also differ notably between companies of varying sizes. Adoption patterns are very diverse across specific technologies. The results further suggest that there is a significant correlation between adoption of digital technologies and different dimensions of company performance.
With the continuous development of the economy, consumers pay more attention to the demand for personalized clothing. However, the recommendation quality of existing clothing recommendation systems is not sufficient to meet users' needs. When browsing clothing online, facial expression is salient information for understanding the user's preference. In this paper, we propose a novel method to automatically personalize clothing recommendations based on user emotional analysis. Firstly, the facial expression is classified by a multiclass SVM. Next, the user's multi-interest value is calculated using the expression intensity obtained by a hybrid RCNN. Finally, the multi-interest values are fused to carry out personalized recommendation. The experimental results show that the proposed method achieves a significant improvement over other algorithms.
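The fusion step can be illustrated with a simplified sketch. The expression-to-interest weights and the intensity-weighted average below are assumptions for illustration; the paper's exact fusion formula is not reproduced here.

```python
# Hypothetical per-expression interest weights (e.g. happy > neutral > disgust);
# the actual mapping from expressions to interest values is an assumption.
EXPRESSION_WEIGHT = {"happy": 1.0, "surprise": 0.7, "neutral": 0.4, "disgust": 0.0}

def multi_interest(observations):
    """Fuse (expression label, intensity) pairs into one interest score.

    Each observation mimics an SVM class label plus an intensity in [0, 1]
    (as a hybrid RCNN-style regressor might estimate it); the fusion here
    is a plain intensity-weighted average, chosen for illustration.
    """
    num = sum(EXPRESSION_WEIGHT[label] * inten for label, inten in observations)
    den = sum(inten for _, inten in observations) or 1.0
    return num / den

# A user's reactions while browsing three garments (invented data)
obs = [("happy", 0.9), ("neutral", 0.3), ("surprise", 0.6)]
score = multi_interest(obs)
```

Items whose fused score exceeds a threshold could then be ranked higher in the recommendation list.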
Driven by digital transformation, manufacturing systems are heading towards autonomy. The implementation of autonomous elements in manufacturing systems remains a major challenge. Small and medium-sized enterprises (SMEs) in particular often lack the experience to assess the degree of Autonomous Production. Therefore, a description model for the assessment of stages of Autonomous Production has been identified as a core element to support such a transformation process. In contrast to existing models, the developed SME-tailored model comprises different levels within a manufacturing system, from single manufacturing cells to the factory level. Furthermore, the model has been validated in several case studies.
The chemical synthesis of polysiloxanes from monomeric starting materials involves a series of hydrolysis, condensation and modification reactions with complex monomeric and oligomeric reaction mixtures. Real-time monitoring and precise process control of the synthesis process is of great importance to ensure reproducible intermediates and products and can readily be performed by optical spectroscopy. In chemical reactions involving rapid and simultaneous functional group transformations and complex reaction mixtures, however, the spectroscopic signals are often ambiguous due to overlapping bands, shifting peaks and changing baselines. The univariate analysis of individual absorbance signals is hence often only of limited use. In contrast, batch modelling based on the multivariate analysis of the time course of principal components (PCs) derived from the reaction spectra provides a more efficient tool for real time monitoring. In batch modelling, not only single absorbance bands are used but information over a broad range of wavelengths is extracted from the evolving spectral fingerprints and used for analysis. Thereby, process control can be based on numerous chemical and morphological changes taking place during synthesis. “Bad” (or abnormal) batches can quickly be distinguished from “normal” ones by comparing the respective reaction trajectories in real time. In this work, FTIR spectroscopy was combined with multivariate data analysis for the in-line process characterization and batch modelling of polysiloxane formation. The synthesis was conducted under different starting conditions using various reactant concentrations. The complex spectral information was evaluated using chemometrics (principal component analysis, PCA). Specific spectral features at different stages of the reaction were assigned to the corresponding reaction steps. Reaction trajectories were derived based on batch modelling using a wide range of wavelengths. 
Subsequently, complexity was reduced again to the most relevant absorbance signals in order to derive a concept for a low-cost process spectroscopic set-up which could be used for real-time process monitoring and reaction control.
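The batch-modelling idea described above, projecting each reaction spectrum onto principal components and following the score trajectory over time, can be sketched in a few lines. The two-band synthetic spectra and the 3-sigma control envelope below are illustrative assumptions, not data or parameters from the study:

```python
import numpy as np

def pca_scores(X, n_components=2):
    """Project mean-centered spectra onto the first principal components via SVD."""
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T, Vt[:n_components]

# Synthetic example: 5 "normal" batches, 50 time points, 100 wavelengths.
# An educt band decays while a product band grows during the reaction.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 50)
band = np.exp(-0.5 * ((np.arange(100) - 40) / 5) ** 2)   # educt band
prod = np.exp(-0.5 * ((np.arange(100) - 70) / 5) ** 2)   # product band
normal = np.stack([
    np.outer(1 - t, band) + np.outer(t, prod)
    + 0.01 * rng.standard_normal((50, 100))
    for _ in range(5)
])

# Fit PCA on all normal spectra pooled over time; PC1 captures conversion.
scores, loadings = pca_scores(normal.reshape(-1, 100), n_components=1)
trajectories = scores.reshape(5, 50)        # PC1 trajectory per batch

# Control envelope: mean +/- 3 sigma across normal batches at each time point.
mean_traj = trajectories.mean(axis=0)
std_traj = trajectories.std(axis=0)

# A deviating batch (only half conversion) leaves the envelope in real time.
bad = np.outer(1 - 0.5 * t, band) + np.outer(0.5 * t, prod)
bad_scores = (bad - normal.reshape(-1, 100).mean(axis=0)) @ loadings.T
out_of_control = np.abs(bad_scores[:, 0] - mean_traj) > 3 * std_traj + 1e-9
print(f"time points outside envelope: {out_of_control.sum()} of {len(t)}")
```

Comparing a running batch's score trajectory against the envelope of historical "normal" batches is the essence of the real-time distinction between normal and abnormal batches described in the abstract.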
This work compares survey tools in order to help developers select a suitable tool for use in an AAL environment. The first step was to identify the basic functionality required of survey tools used with AAL technologies and to compare these tools by their functionality and intended assignments. The comparative study was derived from the data obtained, previous literature studies, and further technical data. A list of requirements was compiled and ranked by relevance to the target application domain. With the help of an integrated assessment method, a generalized estimate value was calculated, and the result is explained. Finally, the planned application of this tool in a running project is described.
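An integrated assessment over requirements ranked by relevance, condensed into a single estimate value per tool, is commonly implemented as a weighted sum. A minimal sketch; the requirement names, weights, and scores below are hypothetical illustrations, not values from the study:

```python
# Hypothetical requirements with relevance weights (summing to 1.0)
# and per-tool scores on a 0-5 scale; all numbers are illustrative.
weights = {"offline capability": 0.3, "accessibility": 0.3,
           "data export": 0.2, "multi-language": 0.2}
tools = {
    "Tool A": {"offline capability": 4, "accessibility": 3,
               "data export": 5, "multi-language": 2},
    "Tool B": {"offline capability": 2, "accessibility": 5,
               "data export": 4, "multi-language": 4},
}

def weighted_score(scores, weights):
    """Generalized estimate value: relevance-weighted sum of requirement scores."""
    return sum(weights[r] * s for r, s in scores.items())

# Rank tools by their estimate value, best first.
ranking = sorted(tools, key=lambda t: weighted_score(tools[t], weights),
                 reverse=True)
print(ranking)  # → ['Tool B', 'Tool A']
```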
Participation in fast fashion brands’ clothes recycling plans in an omnichannel retail environment
(2020)
The rise of the fast fashion industry allows more and more people to participate in fashion consumption, but it comes with negative consequences for the environment. To reduce waste, fast fashion retailers have begun to offer used clothes recycling plans to which customers can submit clothes they no longer wear. Since these recycling plans have so far mainly been operated in offline stores, the rise of omnichannel retailing poses new challenges for retailers with regard to organizing the plans and motivating consumers to participate. Using a sample of N=370 Chinese fast fashion consumers, this paper investigates which factors determine consumers’ willingness to participate in fast fashion brands’ used clothes recycling plans in an omnichannel retailing environment. It finds that consumers’ clothes recycling intention is determined by individual predispositions (environmental attitude, impulsive consumption), by organizational arrangements (channel integration quality), and by the outcomes of their interaction (consumer satisfaction, brand identification). Conclusions are drawn, implications for omnichannel fast fashion retailing practice as well as for further research are derived, and limitations are discussed.
Melamine-formaldehyde resins are widely used for decorative paper impregnation. Resin properties relevant for impregnation are mainly determined already at the stage of resin synthesis by the applied reaction conditions. Thus, understanding the relationship between reaction conditions and technological properties is important. Response surface methodology based on orthogonal parameter level variations is the most suitable tool to identify and quantify factor effects and deduce causal correlation patterns. Here, two major process factors of MF resin synthesis were systematically varied using such a statistical experimental design. To arrive at resins having a broad range of technological properties, initial pH and M:F ratio were varied in a wide range (pH: 7.9–12.1; M:F ratio: 1:1.5–1:4.5). The impregnation behavior of the resins was modeled using viscosity, penetration rate and residual curing capacity as technological responses. Based on the response surface models, nonlinear and synergistic action of process factors was quantified and a suitable process window for preparing resins with favorable impregnation performance was defined. It was found that low M:F ratios (~1:2–1:2.5) and comparatively high starting pHs (~pH 11) yield impregnation resins with rapid impregnation behavior and good residual curing capacity.
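A full second-order response surface of the kind used above (two factors, quadratic and interaction terms) can be fitted by ordinary least squares. The design points and response values below are illustrative placeholders, not the study's data:

```python
import numpy as np

# Hypothetical design points (initial pH, moles F per mole M) and a
# viscosity-like response; the numbers are illustrative only.
pH  = np.array([7.9, 7.9, 10.0, 12.1, 12.1, 10.0, 10.0, 8.5, 11.5])
f_m = np.array([1.5, 4.5, 3.0,  1.5,  4.5,  1.5,  4.5, 3.0, 3.0])
y   = np.array([120., 95., 60., 110., 100., 80., 70., 85., 75.])

# Second-order model: y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
X = np.column_stack([np.ones_like(pH), pH, f_m, pH**2, f_m**2, pH * f_m])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

def predict(x1, x2):
    """Evaluate the fitted response surface at factor settings (x1, x2)."""
    return coef @ np.array([1.0, x1, x2, x1**2, x2**2, x1 * x2])

# Scan a coarse grid over the studied factor ranges to locate a
# favorable process window (here: minimum of the modeled response).
grid = [(p, f) for p in np.linspace(7.9, 12.1, 5)
               for f in np.linspace(1.5, 4.5, 5)]
best = min(grid, key=lambda pf: predict(*pf))
print(f"grid minimum near pH={best[0]:.1f}, M:F=1:{best[1]:.1f}")
```

With real measurements from an orthogonal design in place of the placeholder arrays, the same fit quantifies the nonlinear and synergistic factor effects and delimits the process window described in the abstract.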
It has been widely shown that biomaterial surface topography can modulate host immune response, but a fundamental understanding of how different topographies contribute to pro-inflammatory or anti-inflammatory responses is still lacking. To investigate the impact of surface topography on immune response, we undertook a systematic approach by analyzing immune response to eight grades of medical grade polyurethane of increasing surface roughness in three in vitro models of the human immune system. Polyurethane specimens were produced with defined roughness values by injection molding according to the VDI 3400 industrial standard. Specimens ranged from 0.1 μm to 18 μm in average roughness (Ra), which was confirmed by confocal scanning microscopy. Immunological responses were assessed with THP-1-derived macrophages, human peripheral blood mononuclear cells (PBMCs), and whole blood following culture on polyurethane specimens. As shown by the release of pro-inflammatory and anti-inflammatory cytokines in all three models, a mild immune response to polyurethane was observed, however, this was not associated with the degree of surface roughness. Likewise, the cell morphology (cell spreading, circularity, and elongation) in THP-1-derived macrophages and the expression of CD molecules in the PBMC model on T cells (HLA-DR and CD16), NK cells (HLA-DR), and monocytes (HLA-DR, CD16, CD86, and CD163) showed no influence of surface roughness. In summary, this study shows that modifying surface roughness in the micrometer range on polyurethane has no impact on the pro-inflammatory immune response. Therefore, we propose that such modifications do not affect the immunocompatibility of polyurethane, thereby supporting the notion of polyurethane as a biocompatible material.
How can digitalization in the construction supply industry be mastered successfully? The abundance and complexity of the questions involved can be reduced to two central core questions: What are the right contents and essential value drivers of digitalization? And how must the growing flood of information, the rapidly increasing complexity, and the decreasing predictability be dealt with in the future?
This article presents a framework that helps construction suppliers systematically derive their digital target vision and its value drivers from customer value. The framework takes the specifics of the construction supply industry into account but can, with slight adaptations, also be applied to other industries. Building on the target vision, companies can define which technical, personnel, and organizational changes are required for its implementation. To be able to deal flexibly with the dynamic changes in their ecosystem and with cultural challenges, five influencing factors are also identified that companies must take into account when developing the evolutionary competence this requires.
5-hydroxymethylfurfural (HMF) and furfural are interesting as potential platform chemicals for a bio-based chemical production economy. Within the scope of this work, the process routes under technical development for the production of these platform chemicals were investigated. For two selected processes, the material and energy flows as well as the carbon footprint were examined in detail. Possible optimizations of the production processes, further development potential, and the research demand with regard to reducing the primary energy expenditure were worked out.
The development of a medical device usually takes several years. Legal requirements, such as the German Medical Devices Implementation Act (Medizinprodukte-Durchführungsgesetz), determine which steps must be carried out during development. Compliance with them must be demonstrated in the technical documentation. The technical documents it contains are created over the course of development; they build on one another and reference each other, resulting in heterogeneous and confusing structures. Traceability offers a solution to this problem: it ensures that the requirements for the medical device can be linked with documents such as the requirements catalogue, the user requirements specification, or the technical specification. It is thus traceable at any time which requirements are related to which tests, which changes, or which results. Another important process in the development of medical devices is usability engineering, which is intended to ensure the safety of a medical device and to minimize risks during its use. This process produces many artifacts, such as usability reports. To keep track of all usability data, they can be linked using traceability. This article highlights which requirements usability engineering in medical technology places on traceability.
Projektmanagement
(2020)
Project management is a tool for handling singular tasks in an interdisciplinary, cross-company, and structured manner: tasks that are unique, extremely significant for the company, and cannot simply be processed within the existing line organization. Project management refers both to a concept for managing a complex undertaking and to the institution that manages this undertaking.
Personalmanagement
(2020)
Even though its value does not appear on any balance sheet, human capital determines a company's success. While capital is available in abundance, personnel is increasingly the bottleneck factor. Whereas up until the 1980s people were seen as a production factor and the HR department as its administrative body, HR work today is an integrative element of the management process and the HR department an active part of the management team (Scholz 2014c). Associated with this is the terminological shift from personnel administration to personnel management or human resource management (HRM). The terms signal a more strategically oriented engagement with all questions concerning the deployment of personnel and the linking of HR strategy with corporate strategy.
Important tasks of HR work are personnel planning, recruitment, personnel development, staffing, personnel cost management, and leadership. These are usually performed by different units: besides the HR department, the direct manager and the company management also play an important role.
The dynamic capabilities perspective is aimed at explaining how firms achieve and sustain competitive advantages, especially in environments that become volatile, uncertain, complex, and ambiguous (VUCA). In this paper, we combine factors that explain dynamic capabilities on the firm level with factors of dynamic managerial capabilities on the individual level. In addition to the dynamic capabilities theory, we draw on corporate foresight (CF) literature to test the impact of CF training. We find that both the organizational-level practices and the individual-level training of leaders are positively associated with firm-level outcomes. We further observe that this relationship is mediated by dynamic managerial capabilities (i.e., the ability of leaders to challenge current business models, make decisions under uncertainty, and reconfigure organizational resources). Our findings emphasize the importance of training leaders and building organizational CF practices to build the dynamic capabilities needed in VUCA environments.
The key aim of Open Strategy is to open up the process of strategy development to larger groups within and even outside an organization. Furthermore, Open Strategy aims to include broad groups of stakeholders in the various steps of the strategy process. The question at hand is how can Open Strategy be achieved? What approaches can be used? Scenario planning and business wargaming are approaches perceived as relevant tools in the field of strategy and strategic foresight and in the context of Open Strategy because of their participative nature. The aim of this article is to assess to what degree scenario planning and business wargaming can be used in the context of Open Strategy. While these approaches are suitable, their current application limits the number of potential participants. Further research and experimentation in practice with larger groups and/or online approaches, or a combination of both, are needed to explore the potential of scenario planning and business wargaming as tools for Open Strategy.
Heat pumps are a vital element for reaching the greenhouse gas (GHG) reduction targets in the heating sector, but their system integration requires smart control approaches. In this paper, we first offer a comprehensive literature review and definition of the term control for the described context. Additionally, we present a control approach, which consists of an optimal scheduling module coupled with a detailed energy system simulation module. The aim of this integrated two-part control approach is to improve the performance of an energy system equipped with a heat pump, while recognizing the technical boundaries of the energy system in full detail. By applying this control to a typical family household situation, we illustrate that this integrated approach results in a more realistic heat pump operation and thus a more realistic assessment of the control performance, while still achieving lower operational costs.
Systemische Betrachtung des therapeutischen Roboters Paro im Vergleich zu dem Haustierroboter AIBO
(2020)
Today, robots are found not only in industry but are increasingly used in private areas of life. One example is the social therapy robot Paro. It is modeled on the behavior and appearance of a young seal, expresses emotions, and is used especially in nursing homes, where it shows positive effects on the well-being of people in need of care. This work presents the robot Paro in a systemic analysis, considering its system context, use cases, requirements, and structure. This is followed by an analysis of the pet robot AIBO, which resembles a puppy and mainly serves to entertain private individuals. Similarities and differences between the systems are worked out. It becomes apparent that both systems primarily keep their users company but have different requirements and are used in different application domains. In addition, AIBO has more varied capabilities and a higher degree of mobility than Paro, which is reflected in a more complex hardware structure.
The approach of self-organized and autonomously controlled systems offers great potential to meet new requirements for the economical production of customized products with small batch sizes based on a distributed, flexible management of dynamics and complexity within the production and intralogistics system. To support the practical application of self-organization in intralogistics systems, a catalogue of criteria for the evaluation of the self-organization of flexible logistics systems has been developed and validated, which enables the classification of logistics systems as well as the identification and evaluation of corresponding potentials that can be achieved by increasing the degree of self-organization.
The planning and control of intralogistics systems in line with versatile production systems of smart factories requires new approaches and methods to cope with changing requirements within future factories. The planning of intralogistics can no longer follow a static, sequential approach as in the past, since the planning assumptions are going to change at a high frequency. Reasons for these constant changes are, amongst others, external turbulences such as rapidly changing market conditions and decreasing batch sizes down to customer-specific products with a batch size of one, and, on the other hand, internal turbulences (such as production and logistics resource breakdowns) affecting the production system. This paper gives an insight into research approaches and results on how the capabilities of intelligent logistical objects (intelligent bins, autonomous transport systems, etc.) can be used to achieve a self-organized, cost- and performance-optimized intralogistics system with autonomously controlled process execution within versatile production environments. A first consistent method has been developed, validated, and implemented within a scenario at the pilot factory Werk150 at the ESB Business School (Reutlingen University). Based on the incoming production orders, the method of the Extended Profitability Appraisal (EPA), covering the work system value to define the most effective work system for order fulfilment, is applied. To derive the appropriate intralogistics processes, an autonomous control method involving principles of decentralized and target-oriented decision-making (e.g. intelligent bins interacting with autonomously controlled transport systems to fulfil material orders of assembly workstations) has been developed and applied to achieve a target-optimized process execution.
The results of the first research stage described in this paper, using predefined material sources and sinks, set the basis for the further development of a self-organized and autonomously controlled method for intralogistics systems considering dynamic source and sink relations. By allowing dynamic shifts of production orders in the sense of dynamic source and sink relations, the cost and performance aims of the intralogistics system can be directly aligned with the aims of the entire versatile production system in the sense of self-organized and autonomously controlled systems.
Pharmaceutical companies are among the top investors into research and development (R&D) globally, as product innovation is still the main growth driver for the industry and because the related complexities necessitate enormous R&D investments. The market demand for new medicines to be more efficacious or to provide better safety than existing drugs and the regulatory need to prove superiority in clinical trials are reasons why drug R&D is increasingly expensive and pharmaceutical companies need to manage extraordinarily high costs per approved new compound.
We investigated the state of artificial intelligence (AI) in pharmaceutical research and development (R&D) and outline here a risk and reward perspective regarding digital R&D. Given the novelty of the research area, a combined qualitative and quantitative research method was chosen, including the analysis of annual company reports, investor relations information, patent applications, and scientific publications of 21 pharmaceutical companies for the years 2014 to 2019. As a result, we can confirm that the industry is in an ‘early mature’ phase of using AI in R&D. Furthermore, we can demonstrate that, despite the efforts that need to be managed, recent developments in the industry indicate that it is worthwhile to invest to become a ‘digital pharma player’.
Airports have largely outgrown their sole purpose of being travel hubs, and by connecting millions of passengers to their destinations each year on an international scale, they have become increasingly interesting for business and related marketing opportunities. In fact, passengers are easily segmented and can be reached effectively throughout specific airport areas, making some areas more suitable for advertising than others. Emotional states, roaming time, and freedom of movement strongly influence how much information passengers are able to absorb from their direct surroundings. Our research shows that some areas are more suitable than others; therefore, a careful selection of airport locations for communication will be key to securing the impact and improving the effectiveness of communication measures. With these insights, advertisers can deliberately choose the areas that are most effective for displaying their ads.
Deutschland, quo vadis?
(2020)
Shutdown in Germany in March 2020. Standstill in retail and industry. The market capitalization of a considerable number of companies halved within a very short time. Investors dumped everything onto the market, and amid the high uncertainty all asset classes lost value, at times even gold. Even corporations such as Lufthansa will no longer manage to survive without state aid.
The classic sales tasks are changing rapidly and profoundly. Sales managers urgently need new strategic approaches for how they will shape customer contacts, manage distribution channels, and sell more effectively in the future. A recent study sheds light on how companies can adapt to this structural change.
Value engineering in customer communication is a structured method for improving communication processes between companies. The concept takes up proven elements of technical value analysis and overhead value analysis and transfers them to customer communication. The approach offers a systematic procedure for examining and redesigning communication processes between supplier and customer. Value engineering in customer communication thus creates competitive advantages by optimizing communication.
Innovative capacity is one of the key success factors of the future and will strongly influence the difference between successful and failing companies (PWC, 2015). Young companies and start-ups in particular are known for their high innovation capability. Established companies, by contrast, score less with new ideas but instead with innovation-critical resources, routines, and economies of scale. An approach that is steadily gaining popularity for combining the capabilities and resources of established companies with the innovative power of start-ups is "intrapreneurship".
The SDGs give an overview of the world's development challenges of the present and the coming decades and set a new global agenda for more inclusive and sustainable development and growth. These challenges also represent opportunities for social innovations and the creation of scalable and financially self-sustaining solutions by businesses and (social) entrepreneurs. Examples of solutions to social and ecological challenges include providing low-income communities with access to affordable, quality products and services in areas such as water and sanitation, energy, health, education, and finance. New business models can meet customer demands by providing solutions and thereby create opportunities for low-income people as employees, suppliers, and distributors.
Artificial Intelligence enables innovative applications, and applications based on Artificial Intelligence are increasingly important for all aspects of the Digital Economy. However, the question of how AI resources such as tools and data can be linked to provide an AI-capability and create business value is still open. Therefore, this paper identifies the value-creating mechanisms of connectionist artificial intelligence using a capability-oriented view and points out the connections to different kinds of business value. The analysis supports an agenda that identifies areas that need further research to understand the mechanism of value creation in connectionist artificial intelligence.
AI technologies such as deep learning provide promising advances in many areas. Using these technologies, enterprises and organizations implement new business models and capabilities. In the beginning, AI technologies were deployed in experimental environments, and AI-based applications were created in an ad-hoc manner without methodological guidance or an engineering approach. Due to the increasing importance of AI technologies, however, a more structured approach is necessary that enables the methodological engineering of AI-based applications. Therefore, in this paper we develop first steps towards the methodological engineering of AI-based applications. First, we identify some important differences between the technological foundations of AI technologies, in particular deep learning, and traditional information technologies. Then we create a framework for engineering AI applications in four steps: identification of an AI-application type, sub-type identification, lifecycle phase, and definition of details. The introduced framework considers that AI applications use an inductive approach to infer knowledge from huge collections and streams of data. It not only enables the rapid development of AI applications but also the efficient sharing of knowledge about them.
Improvement of a three-layered in vitro skin model for topical application of irritating substances
(2020)
In the field of skin tissue engineering, the development of physiologically relevant in vitro skin models comprising all skin layers, namely epidermis, dermis, and subcutis, is a great challenge. Increasing regulatory requirements and the ban on animal experiments for substance testing demand the development of reliable and in vivo-like test systems which enable high-throughput screening of substances. However, the reproducibility and applicability of in vitro testing has so far been insufficient due to fibroblast-mediated contraction. To overcome this pitfall, an advanced 3-layered skin model was developed. While the epidermis of standard skin models showed an 80% contraction, the initial epidermal area of our advanced skin models was maintained. The improved barrier function of the advanced models was quantified by an indirect barrier function test and a permeability assay. Histochemical and immunofluorescence staining of the advanced model showed well-defined epidermal layers, a dermal part with distributed human dermal fibroblasts, and a subcutis with round-shaped adipocytes. The successful response of these advanced 3-layered models to skin irritation testing demonstrated their suitability as an in vitro model for these clinical tests: only the advanced model classified irritative and non-irritative substances correctly. These results indicate that the advanced set-up of the 3-layered in vitro skin model maintains skin barrier function and therefore makes it more suitable for irritation testing.
According to numerous studies, haptic feedback is an important component of medical robotics. Most systems, however, are still at the research stage and pursue different approaches. In teleoperation, both sensorless and sensor-based systems are being investigated. In contrast to the encoders in sensorless systems, sensors provide precise measurements but are expensive to purchase, difficult to disinfect, and must be integrated into surgical instruments. In hands-on systems, unlike teleoperation systems, the surgeon directly feels the forces that occur during use; here the robot merely provides the required stability and accuracy and is controlled directly by the human. In teleoperation systems, by contrast, dedicated controllers are used. Here the sigma.7, developed for the operating room, has established itself: compared with competitors developed for general use, it offers haptic feedback in all necessary degrees of freedom and corresponding force feedback.
The automation of work by means of disruptive technologies such as Artificial Intelligence (AI) and Robotic Process Automation (RPA) is currently intensely discussed in business practice and academia. Recent studies indicate that many tasks manually conducted by humans today will no longer be in the future. In a similar vein, it is expected that new roles will emerge. The aim of this study is to analyze prospective employment opportunities in the context of RPA in order to foster our understanding of the pivotal qualifications, expertise, and skills necessary to find an occupation in a completely changing world of work. This study is based on an explorative content analysis of 119 job advertisements related to RPA in Germany. The data was collected from major German online job platforms, qualitatively coded, and subsequently analyzed quantitatively. The research indicates that there indeed are employment opportunities, especially in the consulting sector. The positions require different kinds of technological expertise, such as specific programming languages and knowledge of statistics. The results of this study provide guidance for organizations and individuals on reskilling requirements for future employment. As many of the positions require profound IT expertise, the generally accepted perspective that existing employees affected by automation can be retrained to work in the emerging positions has to be viewed extremely critically. This paper contributes to the body of knowledge by providing a novel perspective on the ongoing discussion of employment opportunities and reskilling demands of the existing workforce in the context of recent technological developments and automation.
Methods based exclusively on heart rate hardly allow differentiating between physical activity, stress, relaxation, and rest, which is why an additional sensor, such as an activity/movement sensor, is added for detection and classification. The response of the heart to physical activity, stress, relaxation, and no activity can be very similar. In this study, we observe the influence of induced stress and analyze which metrics could be considered for its detection. Changes in the Root Mean Square of Successive Differences (RMSSD) provide information about physiological changes. A set of measurements collecting the RR intervals was taken, and the intervals are used as a parameter to distinguish four different stages. Parameters like skin conductivity or skin temperature were not used because the main aim is to keep the number of sensors and devices to a minimum and thereby increase wearability in the future.
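The RMSSD metric used above is straightforward to compute from an RR-interval series: the square root of the mean of the squared successive differences. A minimal sketch; the interval values are illustrative, not measurements from the study:

```python
import numpy as np

def rmssd(rr_ms):
    """Root Mean Square of Successive Differences of RR intervals (ms)."""
    rr = np.asarray(rr_ms, dtype=float)
    diff = np.diff(rr)                  # beat-to-beat differences
    return float(np.sqrt(np.mean(diff ** 2)))

# Illustrative RR series (ms): higher beat-to-beat variability yields a
# higher RMSSD, reflecting stronger parasympathetic (relaxed) influence.
rest   = [812, 790, 835, 801, 828, 795, 820]
stress = [640, 642, 639, 641, 640, 643, 641]
print(f"RMSSD rest:   {rmssd(rest):.1f} ms")
print(f"RMSSD stress: {rmssd(stress):.1f} ms")
```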
In previous studies, we used a method for detecting stress that was based exclusively on heart rate and ECG to differentiate between situations such as mental stress, physical activity, relaxation, and rest. As a response of the heart to these situations, we observed different behavior in the Root Mean Square of Successive Differences (RMSSD) of heartbeats. This study aims to analyze virtual reality, via a virtual reality headset, as an effective stressor for future work. The RMSSD value is an important marker for the parasympathetic effect on the heart and can provide information about stress. For these measurements, the RR intervals were collected using a chest strap; additional sensors were not used for the analysis. We conducted experiments with ten subjects who had to drive a simulator for 25 minutes using monitors and 25 minutes using a virtual reality headset. Before starting and after finishing each simulation, the subjects had to complete a survey in which they described their mental state. The experiment results show that driving with a virtual reality headset has some influence on the heart rate and RMSSD, but it does not significantly increase the stress of driving.
Cardiovascular diseases are directly or indirectly responsible for up to 38.5% of all deaths in Germany and thus represent the most frequent cause of death. At present, heart diseases are mainly discovered by chance during routine visits to the doctor or when acute symptoms occur. However, there is no practical method to proactively detect diseases or abnormalities of the heart in the daily environment and to take preventive measures for the person concerned. Long-term ECG devices, as currently used by physicians, are simply too expensive, impractical, and not widely available for everyday use. This work aims to develop an ECG device suitable for everyday use that can be worn directly on the body. For this purpose, an already existing hardware platform will be analyzed, and the corresponding potential for improvement will be identified. A precise picture of the existing data quality is obtained by metrological examination, and corresponding requirements are defined. Based on these identified optimization potentials, a new ECG device is developed. The revised ECG device is characterized by a high integration density and combines all components directly on one board except the battery and the ECG electrodes. The compact design allows the device to be attached directly to the chest. An integrated microcontroller allows digital signal processing without the need for an additional computer. Central features of the evaluation are a peak detection for detecting R-peaks and a calculation of the current heart rate based on the RR interval. To ensure the validity of the detected R-peaks, a model of the anatomical conditions is used. Thus, unrealistic RR-intervals can be excluded. The wireless interface allows continuous transmission of the calculated heart rate. Following the development of hardware and software, the results are verified, and appropriate conclusions about the data quality are drawn. 
As a result, a very compact, wearable ECG device with several wireless technologies, data storage, and evaluation of RR intervals was developed. Tests yielded runtimes of up to 24 hours with wireless LAN activated and streaming.
Energy supply companies (EVU) currently face a multitude of challenges as the energy transition progresses. The increasing decentralization and digitalization of the energy industry, together with growing competitive and cost pressure, force utilities to depart from the familiar paths of the commodity business, to develop new products and services, and to open up new business fields. Customers, with their increasingly individualized and complex needs, are moving into the center of attention. If utilities want to succeed in the market in the long term, they must evolve from classic energy suppliers offering simple products into innovative, customer-centric energy service providers. A key to this is the development of energy services. At the same time, many utilities struggle with this development due to internal obstacles such as a lack of personnel capacity, an organization that is inflexible in view of the dynamic market environment and burdened with long development times, small budgets, and a lack of professional and/or methodological know-how. This is shown by the results of the present survey of managing directors and board members in the energy industry.
Digital technologies are the main strategic drivers of digitalization and offer ubiquitous data availability, unlimited connectivity, and massive processing power for a fundamentally changing business. This leads to the development and application of intelligent digital systems. The current state of research and practice in architecting digital systems and services lacks a solid methodological foundation that fully accommodates all requirements linked to the efficient and effective development of digital systems in organizations. The research presented in this paper addresses the question of how the management of complexity in digital systems and architectures can be supported from a methodological perspective. In this context, the current focus is on a better understanding of the causes of increased complexity and of the requirements for methodological support. For this purpose, we take an enterprise architecture (EA) perspective, i.e., how the introduction of digital systems affects the complexity of the EA. Two industrial case studies and a systematic literature analysis result in the proposal of an extended Digital Enterprise Architecture Cube as a framework for future methodical support.
Going forward with the requirements of missions to the Moon and further into deep space, the European Space Agency is investigating new methods of astronaut training that can help accelerate learning, increase availability, and reduce complexity and cost compared to currently used methods. To achieve this, technologies such as virtual reality may be utilized. This paper investigates the benefits of using virtual reality for extravehicular activity training in comparison to conventional training methods such as neutral buoyancy pools. To determine the requirements and current uses of virtual reality for extravehicular activity training, first-hand tests of currently available software as well as expert interviews are utilized. With this knowledge, a concept is developed that may be used to further advance training methods in virtual reality. The resulting concept serves as the basis for developing a prototype that showcases user interactions and locomotion in microgravity simulations.
Investigation of tympanic membrane influences on middle-ear impedance measurements and simulations
(2020)
This study simulates acoustic impedance measurements in the human ear canal and investigates error influences due to improperly accounted-for evanescence in the probe's near field, cross-sectional area changes, curvature of the ear canal, and pressure inhomogeneities across the tympanic membrane, which arise mainly at frequencies above 10 kHz. Evanescence results from strongly damped higher-order modes, which are found only in the near field of the sound source and are excited by sharp cross-sectional changes such as those occurring at the transition from the probe loudspeaker to the ear canal. This means that different impedances are measured depending on the probe design. The influence of evanescence cannot be eliminated completely from measurements; however, it can be reduced by a probe design with a larger distance between speaker and microphone. A completely different approach to accounting for the influence of evanescence is to evaluate impedance measurements with the help of a finite element model that takes the precise arrangement of microphone and speaker in the measurement into account. The latter is demonstrated in this study using impedance measurements on a tube terminated with a steel plate. Furthermore, the influences of shape changes of the tympanic membrane and of ear canal curvature on impedance are investigated.
Checklists are a valuable tool to ensure process quality and quality of care. To ensure proper integration in clinical processes, it would be desirable to generate checklists directly from formal process descriptions. Those checklists could also be used for user interaction in context-aware surgical assist systems. We built a tool to automatically convert Business Process Model and Notation (BPMN) process models to checklists displayed as HTML websites. Gateways representing decisions are mapped to checklist items that trigger dynamic content loading based on the placed checkmark. The usability of the resulting system was positively evaluated regarding comprehensibility and end-user friendliness.
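The mapping described above, tasks to checklist items and decision gateways to items that control which branch content is loaded, can be sketched as follows. The function names, HTML structure, and `data-target` attribute are illustrative assumptions; the authors' tool parses actual BPMN models:

```python
def task_to_html(task_id, label):
    """Render a BPMN task as a plain HTML checklist item (illustrative)."""
    return f'<li><input type="checkbox" id="{task_id}"> {label}</li>'

def gateway_to_html(gw_id, question, branches):
    """Render an exclusive gateway as a checklist item whose answer
    selects which branch's sub-checklist to load dynamically.

    branches: mapping of answer label -> id of the sub-checklist section.
    """
    options = "".join(
        f'<label><input type="radio" name="{gw_id}" '
        f'data-target="{target}"> {answer}</label>'
        for answer, target in branches.items()
    )
    return f"<li>{question} {options}</li>"
```

A small script on the rendered page would then listen for changes on the radio group and show the section referenced by `data-target`, giving the dynamic content loading mentioned above.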
Drug-induced liver toxicity is one of the most common reasons for the failure of drugs in clinical trials and for frequent withdrawals from the market. Reasons for such failures include the low predictive power of in vivo studies, which is mainly caused by metabolic differences between humans and animals and by intraspecific variances. In addition to factors such as age and genetic background, changes in drug metabolism can also be caused by disease-related changes in the liver. Such metabolic changes have also been observed in clinical settings, for example, in association with a change in liver stiffness, a major characteristic of an altered fibrotic liver. To mimic these changes in an in vitro model, this study aimed to develop scaffolds that represent the rigidity of healthy and fibrotic liver tissue. We observed that liver cells plated on scaffolds representing the stiffness of healthy livers showed higher metabolic activity than cells plated on stiffer scaffolds. Additionally, we detected a positive effect of pre-coating the scaffolds with fetal calf serum (FCS)-containing media: this pre-incubation resulted in increased cell adherence during cell seeding onto the scaffolds. In summary, we developed a scaffold-based 3D model that mimics liver stiffness-dependent changes in drug metabolism and may more easily predict drug interactions in diseased livers.
This article provides an overview of the different options for accounting for an Initial Coin Offering (ICO) on the liabilities side of the issuer's balance sheet under IFRS. The aim is to discuss the accounting classification of different types of tokens and to support issuers both in structuring the tokens and in their subsequent accounting. The results show that while the existing standards are sufficient for classifying ICO tokens, a wide range of accounting treatments has to be considered, so that detailed regulation in a dedicated IFRS appears difficult.
Polyelectrolyte multilayer (PEM) coatings are prepared by alternating layer-by-layer deposition of cationic and anionic polyelectrolyte monolayers on charged surfaces. The thickness of the coatings ranges from a few nm to a few μm. Their properties, such as roughness, stiffness, surface charge, and surface energy, can be precisely tuned to fulfil different technical or biological requirements. The coating process is based on the self-assembly of polyelectrolytes. Advantages of these coatings are their easy handling, the absence of harsh chemistry, and the possibility of coating complex geometries. PEM coatings can be prepared from a variety of suitable polyelectrolytes. Their stability ranges from very durable PEM coatings that are soluble only in strong solvents to quickly degradable ones, which may be applied as drug release systems. One example of such a degradable PEM system is the one based on the polyelectrolyte pair hyaluronan (HA) and chitosan (CHI). These biopolymers originate from natural sources and show low toxicity towards human cells. However, HA/CHI multilayers show only weak adhesiveness for human umbilical vein endothelial cells (HUVEC). In this article, we summarize our approaches to enhancing HA/CHI multilayers by incorporating a non-polymer substance, graphene oxide, to improve cell adhesion while retaining properties such as low cytotoxicity and biodegradability. Different approaches for incorporating graphene oxide were performed, and cellular adhesion was tested by metabolic assay.
A holistic approach to digitization enables decision-makers to achieve new efficiency in corporate performance management. Digitalization improves the quality, validity, and speed of information retrieval and processing. At present, most corporations are confronted with the problem of not being able to organize, categorize, and visualize decision-relevant information. To meet the challenges of information management, the Management Cockpit provides an information center for managers. In accordance with the specific working environment of executives, the Management Cockpit offers a quick and comprehensive overview of the company's situation. Today, the current situation of a company is no longer influenced only by internal factors, but also by its public image. Social media monitoring and analysis is therefore a crucial component for the external factors of successful management. Real-time monitoring of the emotions and behaviors of consumers and customers thus contributes to effective controlling of all business areas. Intelligent factories promise to collect data for internal factors, but the current reality in manufacturing looks different. Production often consists of a large number of different machines with varying degrees of digitization and limited sensor data availability. To close this gap, we developed a compact sensor board with network components, which allows a flexible design with different sensors for a wide variety of applications. The sensor data enable decision-makers to adapt the supply chain based on their internal and external observations in the Management Cockpit. Due to its real-time and long-term monitoring and analysis possibilities, the Management Cockpit provides a multi-dimensional view of the company and supports a holistic Corporate Performance Management.
The advent of chatbots in customer service solutions has received increasing attention from research and practice in recent years. However, the relevant dimensions and features of service quality and service performance for chatbots remain quite unclear. Therefore, this research develops and tests a conceptual model of customer service quality and customer service performance in the context of chatbots. Additionally, the impact of the developed service dimensions on different customer relationship metrics is measured across different service channels (hotline versus chatbot). Findings from six independent studies indicate a strong main effect of the conceptualized service dimensions on customer satisfaction, service costs, intention to reuse the service, word of mouth, and customer loyalty. However, different service dimensions are relevant for chatbots than for a traditional service hotline.