A fast way to test business ideas and to explore customer problems and needs is to talk to customers. Customer interviews help to understand which solutions customers will pay for before valuable resources are invested in developing them. Customer interviews are a good way to gain qualitative insights. However, conducting interviews can be a difficult procedure and requires specific skills. The current ways of teaching interview skills have significant deficiencies; in particular, they lack guidance and opportunities to practice. Objective: The goal of this work is to develop and validate a workshop format to teach interview skills for conducting good customer interviews in a practical manner. Method: The research method is based on design science research, which serves as a framework. A game-based workshop format was designed to teach interview skills. The approach consists of a half-day, hands-on workshop and is based on an analysis of necessary interview skills. The approach has been validated in several workshops and improved based on learnings from those workshops. Results: Results of the validation show that participants could significantly improve their interview skills while enjoying the game-based exercises. The game-based learning approach supports learning and practicing customer interview skills with playful and interactive elements that encourage greater motivation among participants to conduct interviews.
Automatic classification of rotating machinery defects using Machine Learning (ML) algorithms
(2020)
Electric machines and motors have been the subject of enormous development. New concepts in design and control allow expanding their applications in different fields. Vast amounts of data have been collected in almost every domain of interest. They can be static; that is to say, they represent real-world processes at a fixed point in time. Vibration analysis and vibration monitoring, including the detection and monitoring of anomalies in vibration data, are widely used techniques for predictive maintenance in high-speed rotating machines. However, accurately identifying the presence of a bearing fault can be challenging in practice, especially when the failure is still at its incipient stage and the signal-to-noise ratio of the monitored signal is small. The main objective of this work is to design a system that analyzes the vibration signals of a rotating machine, based on recorded sensor data, in the time/frequency domain. Given the substantial interest in this task, there has been a dramatic increase in the application of Machine Learning (ML) algorithms to it. An ML system will be used to classify and detect abnormal behavior and recognize the different levels of machine operation modes. The proposed solution can be deployed as predictive maintenance for Industry 4.0.
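The abstract does not include implementation details; purely as a hedged illustration of the kind of pipeline it describes, the following Python sketch derives simple time/frequency-domain features from synthetic vibration windows and trains an off-the-shelf classifier. The sampling rate, fault frequencies, and classifier choice are assumptions, not taken from the paper.

```python
# Illustrative sketch only: FFT-based feature extraction plus an ML classifier
# for vibration-based fault classification. Signal source, sampling rate and
# class labels are assumptions, not taken from the paper.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

def extract_features(window, fs=10_000):
    """Compute simple time/frequency-domain features for one vibration window."""
    spectrum = np.abs(np.fft.rfft(window))
    freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
    return np.array([
        window.std(),                                    # overall vibration level
        np.abs(window).max(),                            # peak amplitude
        freqs[spectrum.argmax()],                        # dominant frequency
        spectrum[freqs > 1_000].sum() / spectrum.sum(),  # high-frequency energy share
    ])

# Synthetic stand-in data: 3 operation modes, 200 windows of 1024 samples each.
rng = np.random.default_rng(0)
X, y = [], []
for label, defect_freq in enumerate([0, 1_500, 3_200]):      # 0 = healthy
    for _ in range(200):
        t = np.arange(1024) / 10_000
        window = rng.normal(0, 1, 1024) + np.sin(2 * np.pi * 50 * t)
        if defect_freq:
            window += 0.8 * np.sin(2 * np.pi * defect_freq * t)  # fault signature
        X.append(extract_features(window))
        y.append(label)

X_train, X_test, y_train, y_test = train_test_split(np.array(X), np.array(y), random_state=0)
clf = RandomForestClassifier(random_state=0).fit(X_train, y_train)
print(f"test accuracy: {clf.score(X_test, y_test):.2f}")
```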
Our paper gives first answers to a fundamental question: how can the design of architectures of intelligent digital systems and services be accomplished methodologically? Intelligent systems and services are the goals of many current digitalization efforts and part of massive digital transformation efforts based on digital technologies. Digital systems and services are the foundation of digital platforms and ecosystems. Digitalization disrupts existing businesses, technologies, and economies and promotes the architecture of open environments. This has a strong impact on new value-added opportunities and the development of intelligent digital systems and services. Digital technologies such as artificial intelligence, the Internet of Things, services computing, cloud computing, big data with analytics, mobile systems, and social enterprise network systems are important enablers of digitalization. The current publication presents our research on the architecture of intelligent digital ecosystems and products and services influenced by the service-dominant logic. We present original methodological extensions and a new reference model for digital architectures with an integral service and value perspective to model intelligent systems and services that effectively align digital strategies and architectures, with artificial intelligence as a main element to support intelligent digitalization.
Artificial Intelligence-based Assistants (AIAs) are spreading quickly both in homes and offices. They have already left their original habitat of "intelligent speakers" providing easy access to music collections. They have initiated a multitude of new devices and are already populating devices such as TV sets. Characteristic of intelligent digital assistants is the formation of platforms around their core functionality. Thus, the AI capabilities of the assistants are used to offer new services and create new interfaces for business processes. There are positive network effects between the assistants and the services as well as within the services. Therefore, many companies see the need to get involved in the field of digital assistants but lack a framework to align their initiatives with their corporate strategies. In order to lay the foundation for a comprehensive method, we therefore investigate intelligent digital assistants. Based on this analysis, we develop a framework of strategic opportunities and challenges.
Intelligent systems and services are the strategic targets of many current digitalization efforts and part of massive digital transformations based on digital technologies with artificial intelligence. Digital platform architectures and ecosystems provide an essential base for intelligent digital systems. The paper raises an important question: Which development paths are induced by current innovations in the field of artificial intelligence and digitalization for enterprise architectures? Digitalization disrupts existing enterprises, technologies, and economies and promotes the architecture of cognitive and open intelligent environments. This has a strong impact on new opportunities for value creation and the development of intelligent digital systems and services. Digital technologies such as artificial intelligence, the Internet of Things, service computing, cloud computing, blockchains, big data with analytics, mobile systems, and social business network systems are essential drivers of digitalization. We investigate the development of intelligent digital systems supported by a suitable digital enterprise architecture. We present methodological advances and an evolutionary path for architectures with an integral service and value perspective to enable intelligent systems and services that effectively combine digital strategies and digital architectures with artificial intelligence.
Today, many companies are adapting their strategy, business models, products, services as well as business processes and information systems in order to expand their digitalization level through intelligent systems and services. The paper raises an important question: What are cognitive co-creation mechanisms for extending digital services and architectures to readjust the usage value of smart services? Typically, extensions of digital services and products and their architectures are manual design tasks that are complex and require specialized, rare experts. The current publication explores the basic idea of extending specific digital artifacts, such as intelligent service architectures, through mechanisms of cognitive co-creation to enable a rapid evolutionary path and better integration of humans and intelligent systems. We explore the development of intelligent service architectures through a combined, iterative, and permanent task of co-creation between humans and intelligent systems as part of a new concept of cognitively adapted smart services. In this paper, we present components of a new platform for the joint co-creation of cognitive services for an ecosystem of intelligent services that enables the adaptation of digital services and architectures.
Background
A pressing task of electrocardiographic examinations is to increase the reliability of diagnosing the condition of the heart. Within the framework of this task, an important direction is the solution of the inverse problem of electrocardiography, based on the processing of electrocardiographic signals from multichannel cardio leads at known electrode coordinates in these leads (Titomir et al., Noninvasive electrocardiotopography, 2003; Macfarlane et al., Comprehensive Electrocardiology, 2nd ed., Chapter 9, 2011).
Results
In order to obtain more detailed information about the electrical activity of the heart, we carry out a reconstruction of the distribution of equivalent electrical sources on the heart surface. In this area, we perform the reconstruction of the equivalent sources during the cardiac cycle at relatively low hardware cost. ECG maps of electrical potentials on the torso surface (TSPM) and of electrical sources on the heart surface (HSSM) were studied for different times of the cardiac cycle. We carried out a visual and quantitative comparison of these maps in the presence of pathological regions of different localization. For this purpose we used a model of the heart's electrical activity based on cellular automata.
Conclusions
The cellular automata model allows us to consider the processes of heart excitation in the presence of pathological regions of various sizes and localizations. It is shown that changes in the distribution of electrical sources on the surface of the epicardium in the presence of pathological areas with disturbed conduction of heart excitation are much more noticeable than changes in ECG maps on the torso surface.
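The abstract does not give the automaton's rules; purely as an illustration of the general technique, the sketch below implements a minimal excitable-medium cellular automaton (Greenberg-Hastings style) in which a non-conducting region stands in for a pathological area. The grid size, neighborhood, and state timings are assumptions, not the paper's model.

```python
# Minimal excitable-medium cellular automaton (Greenberg-Hastings style) as a
# generic illustration of CA-based excitation modeling; the actual model in the
# paper may use different states, rules, and geometry.
import numpy as np

REST, EXCITED, REFRACTORY = 0, 1, 2

def step(grid, conducting):
    new = grid.copy()
    # excited cells become refractory; refractory cells return to rest
    new[grid == EXCITED] = REFRACTORY
    new[grid == REFRACTORY] = REST
    # resting, conducting cells become excited if any 4-neighbor is excited
    # (np.roll wraps at the edges, i.e. periodic boundaries for simplicity)
    excited = (grid == EXCITED)
    neighbor_excited = (
        np.roll(excited, 1, 0) | np.roll(excited, -1, 0) |
        np.roll(excited, 1, 1) | np.roll(excited, -1, 1)
    )
    new[(grid == REST) & conducting & neighbor_excited] = EXCITED
    return new

n = 64
grid = np.zeros((n, n), dtype=int)
grid[n // 2, n // 2] = EXCITED              # stimulation site
conducting = np.ones((n, n), dtype=bool)
conducting[20:30, 35:50] = False            # non-conducting "pathological" region

for _ in range(40):                          # propagate one excitation wave
    grid = step(grid, conducting)
print("excited cells after 40 steps:", int((grid == EXCITED).sum()))
```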
Intraoperative brain deformation, so-called brain shift, affects the applicability of preoperative magnetic resonance imaging (MRI) data to assist the procedures of intraoperative ultrasound (iUS) guidance during neurosurgery. This paper proposes a deep learning-based approach for fast and accurate deformable registration of preoperative MRI to iUS images to correct brain shift. Based on the architecture of 3D convolutional neural networks, the proposed deep MRI-iUS registration method has been successfully tested and evaluated on the retrospective evaluation of cerebral tumors (RESECT) dataset. This study showed that our proposed method outperforms other registration methods in previous studies, with an average mean squared error (MSE) of 85. Moreover, this method can register three 3D MRI-US pairs in less than a second, improving the expected outcomes of brain surgery.
Purpose: Gliomas are the most common and aggressive type of brain tumors due to their infiltrative nature and rapid progression. The process of distinguishing tumor boundaries from healthy cells is still a challenging task in the clinical routine. Fluid attenuated inversion recovery (FLAIR) MRI modality can provide the physician with information about tumor infiltration. Therefore, this paper proposes a new generic deep learning architecture, namely DeepSeg, for fully automated detection and segmentation of the brain lesion using FLAIR MRI data.
Methods: The developed DeepSeg is a modular decoupling framework. It consists of two connected core parts based on an encoding and decoding relationship. The encoder part is a convolutional neural network (CNN) responsible for spatial information extraction. The resulting semantic map is inserted into the decoder part to get the full-resolution probability map. Based on a modified U-Net architecture, different CNN models such as residual neural network (ResNet), dense convolutional network (DenseNet), and NASNet have been utilized in this study.
Results: The proposed deep learning architectures have been successfully tested and evaluated online on MRI datasets of the brain tumor segmentation (BraTS 2019) challenge, including 336 cases as training data and 125 cases as validation data. The Dice and Hausdorff distance scores of the obtained segmentation results are about 0.81 to 0.84 and 9.8 to 19.7, respectively.
Conclusion: This study showed successful feasibility and comparative performance of applying different deep learning models in a new DeepSeg framework for automated brain tumor segmentation in FLAIR MR images. The proposed DeepSeg is open source and freely available at https://github.com/razeineldin/DeepSeg/.
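For context, the Dice score reported above is the standard overlap metric of the BraTS evaluation; a minimal stand-alone computation (not taken from the DeepSeg code base) looks like this:

```python
# Minimal Dice score computation for binary segmentation masks; illustrative
# only, not taken from the DeepSeg repository.
import numpy as np

def dice_score(pred, truth, eps=1e-7):
    """Dice = 2 * |P intersect T| / (|P| + |T|) for binary masks."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    intersection = np.logical_and(pred, truth).sum()
    return (2.0 * intersection + eps) / (pred.sum() + truth.sum() + eps)

# Toy 3D masks standing in for a FLAIR lesion segmentation and its ground truth.
pred = np.zeros((8, 8, 8)); pred[2:6, 2:6, 2:6] = 1
truth = np.zeros((8, 8, 8)); truth[3:7, 2:6, 2:6] = 1
print(f"Dice: {dice_score(pred, truth):.3f}")   # 0.750 for this toy overlap
```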
Here, we report the mechanical and water sorption properties of a green composite based on Typha latifolia fibres. The composite was prepared either completely binder-less or bonded with 10% (w/w) of a bio-based resin, a mixture of an epoxidized linseed oil and a tall-oil-based polyamide. The flexural modulus of elasticity, the flexural strength and the water absorption of hot-pressed Typha panels were measured, and the influence of pressing time and panel density on these properties was investigated. The cure kinetics of the bio-based resin was analyzed by differential scanning calorimetry (DSC) in combination with the iso-conversional kinetic analysis method of Vyazovkin to derive the curing conditions required for achieving a completely cured resin. For the binder-less Typha panels, the best technological properties were achieved for panels with high density. By adding 10% of the binder resin, the flexural strength and especially the water absorption were improved significantly.
In recent years, machine learning algorithms have seen huge advances in performance and applicability in industry, and especially in maintenance. Their application enables predictive maintenance and thus offers efficiency gains. However, a successful implementation of such solutions still requires high effort in data preparation to obtain the right information, interdisciplinarity in teams, as well as good communication with employees. Here, small and medium-sized enterprises (SMEs) often lack experience, competence and capacity. This paper presents a systematic and practice-oriented method for implementing machine learning solutions for predictive maintenance in SMEs, which has already been validated.
Here, we study resin cure and network formation of solid melamine formaldehyde pre-polymer over a large temperature range via dynamic temperature curing profiles. Real-time infrared spectroscopy is used to analyze the chemical changes during network formation and network hardening. By applying chemometrics (multivariate curve resolution, MCR), the essential chemical functionalities that constitute the network at a given stage of curing are mathematically extracted and tracked over time. The three spectral components identified by MCR were methylol-rich, ether-linkage-rich and methylene-linkage-rich resin entities. Based on dynamic changes of their characteristic spectral patterns in dependence of temperature, curing is divided into five phases: (I) a stationary phase with free methylols as the main chemical feature, (II) formation of a flexible network cross-linked by ether linkages, (III) formation of a rigid, ether-cross-linked network, (IV) further hardening via transformation of methylols and ethers into methylene cross-linkages, and (V) network consolidation via transformation of ether into methylene bridges. The presented spectroscopic/chemometric approach can be used as a methodological basis for the functionality design of MF-based surface films at the stage of laminate pressing, i.e., for tailoring the technological property profile of cured MF films using a causal understanding of the underlying chemistry based on molecular markers and spectroscopic fingerprints.
The exercises and case studies in this workbook allow targeted revision and consolidation of the central chapters and topics of the textbook and their application to problems from business practice. The detailed solution notes for all exercises are immediately comprehensible thanks to numerous figures and tables. The workbook thus provides, above all, optimal preparation for examinations.
Our paper investigates the response of acquiring firms’ stock returns around the announcement date in cross-border mergers and acquisitions (M&A) between listed Chinese acquirers and German targets. We apply an event study methodology to examine the shareholder value effect based on a sample of M&A deals over the most recent period of 2012-2018. We apply a market model event study based on the argumentation of Brown and Warner (1985) and use short-term observation periods according to Andrade, Mitchell, and Stafford (2001) as well as Hackbarth and Morellec (2008). The results indicate that the announcement of M&A involving German targets results in a positive cumulative abnormal return of 2.18% on average for Chinese acquirers’ shareholders in a five-day symmetric event window. Furthermore, we found slight indications of possible information leakage prior to the formal announcement. Although the size of acquiring firms is not necessarily correlated with the positive abnormal returns in the short run, this study suggests that Chinese acquirers’ shareholders gain higher abnormal returns when the German targets are non-listed companies.
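The mechanics of such a market-model event study can be sketched briefly; the return series below are synthetic stand-ins, not the paper's M&A sample, and the window placement merely mirrors the five-day symmetric window described above.

```python
# Sketch of a market-model event study in the spirit of Brown and Warner (1985);
# the return series are synthetic stand-ins, not the paper's M&A sample.
import numpy as np

rng = np.random.default_rng(1)
market = rng.normal(0.0005, 0.01, 260)                  # daily market returns
stock = 0.0002 + 1.2 * market + rng.normal(0, 0.01, 260)
stock[250:255] += 0.004                                 # injected announcement effect

# 1) Estimate alpha and beta of the market model on a pre-event estimation window.
est = slice(0, 240)
beta, alpha = np.polyfit(market[est], stock[est], 1)    # slope, intercept

# 2) Abnormal return = actual return minus market-model expectation.
ar = stock - (alpha + beta * market)

# 3) Cumulative abnormal return over a five-day symmetric event window
#    around the announcement day (here day 252).
event_window = slice(250, 255)                          # days [-2, +2]
car = ar[event_window].sum()
print(f"CAR[-2,+2] = {car:.2%}")
```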
This study investigates how integrated reporting (IR) creates value for investors. It examines how providers of financial capital benefit from an improved firm information environment provided by IR. Specifically, this study investigates the effect of voluntary IR disclosure on analyst earnings forecast accuracy as well as on firm value. To do so, we use an international sample of 167 listed companies that voluntarily publish an integrated report. Our analysis shows no significant effect of a voluntary IR publication on analyst earnings forecast accuracy and no significant effect on firm value. We thus do not find evidence for the fulfillment of IR's promises regarding improved information environment and value creation of voluntary adopters. We conclude that such companies might already have a relatively high level of transparency leading to an absent additional effect of IR disclosure. Positive effects of IR appear to be more relevant in environments where IR is mandatory.
This article analyzes the reform of the IFRS and US GAAP standards on lease accounting. Using McKesson Europe AG as an example, the effects of the first-time application of the standards on the lessee are illustrated. Of particular interest is a comparison of the accounting models under the "old" standards IAS 17 and ASC 840 and under the "new" standards IFRS 16 and ASC 842. The results show that IFRS and US GAAP do not fully converge. Differences arise above all in the presentation in the income statement, and these also affect earnings ratios.
This book presents an empirical investigation of the efforts that multinational pharmaceutical companies take in order to find a business model that allows for a profitable access to the Bottom of the Pyramid (BoP) markets. The Bottom of the Pyramid in Africa is frequently mentioned as an attractive market due to its sheer size. Yet most companies struggle to access it because of the low price level, difficult physical market access and challenges when it comes to payment.
More specifically, the book investigates the following business model-related questions: Do pharmaceutical companies provide products that meet the needs of the BoP? What characterizes the value generation of the company? What revenue model leads to a profitable business, and what role does a network of partners play in the business model?
Findings reveal that there is no ‘one-size-fits-all’ answer to these questions. Providing continuous availability, affordability at a good quality of goods and services, creating health awareness, as well as localizing business to achieve a level of inclusiveness are essential prerequisites for success. In the last chapter this book provides a business model prototype that accounts for these key success factors for business at the Bottom of the Pyramid and points to further research topics.
Massive data transfers in modern data-intensive systems, resulting from low data-locality and data-to-code system design, hurt their performance and scalability. Near-data processing (NDP) and a shift to code-to-data designs may represent a viable solution, as packaging combinations of storage and compute elements on the same device has become feasible.
The shift towards NDP system architectures calls for a revision of established principles. Abstractions such as data formats and layouts typically span multiple layers in a traditional DBMS, and the way they are processed is encapsulated within these layers of abstraction. NDP-style processing requires an explicit definition of cross-layer data formats and accessors to ensure in-situ execution that optimally utilizes the properties of the underlying NDP storage and compute elements. In this paper, we make the case for such data format definitions and investigate the performance benefits under NoFTL-KV and the COSMOS hardware platform.
nKV in action: accelerating KV-stores on native computational storage with near-data processing
(2020)
Massive data transfers in modern data-intensive systems resulting from low data-locality and data-to-code system design hurt their performance and scalability. Near-data processing (NDP) designs represent a feasible solution which, although not new, has yet to see widespread use.
In this paper we demonstrate various NDP alternatives in nKV, which is a key/value store utilizing native computational storage and near-data processing. We showcase the execution of classical operations (GET, SCAN) and complex graph-processing algorithms (Betweenness Centrality) in-situ, with 1.4x-2.7x better performance due to NDP. nKV runs on real hardware - the COSMOS+ platform.
Massive data transfers in modern key/value stores resulting from low data-locality and data-to-code system design hurt their performance and scalability. Near-data processing (NDP) designs represent a feasible solution which, although not new, has yet to see widespread use.
In this paper we introduce nKV, a key/value store utilizing native computational storage and near-data processing. On the one hand, nKV can directly control the data and computation placement on the underlying storage hardware. On the other hand, nKV propagates the data formats and layouts to the storage device, where software and hardware parsers and accessors are implemented. Both allow NDP operations to execute in a host-intervention-free manner, directly on physical addresses, and thus better utilize the underlying hardware. Our performance evaluation is based on executing traditional KV operations (GET, SCAN) and complex graph-processing algorithms (Betweenness Centrality) in-situ, with 1.4×-2.7× better performance on real hardware – the COSMOS+ platform.
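The core code-to-data idea behind such designs can be conveyed with a schematic analogy; the sketch below is not nKV's interface (which targets real computational storage on the COSMOS+ platform), it only contrasts host-side filtering with an NDP-style scan in which the predicate runs next to the data.

```python
# Schematic contrast of host-side processing vs NDP-style pushdown for a SCAN
# with a predicate. An analogy only; nKV executes such operations on real
# computational storage hardware, not through a Python interface.
class StorageDevice:
    def __init__(self, records):
        self._records = records            # data resident "on device"

    def read_all(self):
        # data-to-code: every record crosses the storage/host interface
        return list(self._records)

    def scan_ndp(self, predicate):
        # code-to-data: the predicate runs next to the data; only the matches
        # (a far smaller transfer) cross the interface
        return [r for r in self._records if predicate(r)]

dev = StorageDevice({f"key{i}": i for i in range(100_000)}.items())

# Host-side filtering transfers 100,000 records to the host...
hot = [kv for kv in dev.read_all() if kv[1] % 1000 == 0]
# ...while the NDP-style scan transfers only the 100 matching records.
hot_ndp = dev.scan_ndp(lambda kv: kv[1] % 1000 == 0)
assert hot == hot_ndp
```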
The field of breath analysis has become of growing interest for medical diagnosis and patient monitoring. The main advantages are that it is noninvasive, painless and repeatable in flexible cycles. Even though breath analysis has been researched for a couple of decades, there are still many unanswered questions. Human breath contains volatile organic compounds which are emitted from inside the body. Some of these compounds can be assigned to specific sources, such as inflammation or cancer, but also to non-health-related origins. This paper gives an overview of breath analysis for the purpose of disease diagnosis and health monitoring. To this end, literature on breath analysis in the medical field has been analyzed, from its early stages to the present. As a result, this paper gives an outline of the topic of breath analysis.
A considerable share of car accidents can be attributed to drowsiness at the wheel. To prevent accidents caused by drowsiness, several approaches already exist, such as detection based on driving behavior. Within the IoT lab of the master's program Human Centered Computing at Hochschule Reutlingen, various driver assistance systems are to be developed and tested in order to prevent accidents caused by drowsiness. This work deals with drowsiness detection via computer vision (CV) and the electrocardiogram (ECG). In this paper, CV-based drowsiness detection at the wheel is implemented using the open source libraries OpenCV and Dlib and the embedded PC Nvidia Jetson Nano. Drowsiness is detected from the ECG via the heartbeat and heart rate variability. In addition, an interface between CV and ECG was developed to combine the detection-relevant data from the Python scripts for CV-based and ECG-based drowsiness detection. These data are then evaluated to produce an overall result.
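A common building block for CV-based drowsiness detection with OpenCV and Dlib is the eye aspect ratio (EAR) computed over facial landmarks; the hedged sketch below shows this idea. The threshold, frame count, and landmark-model path are illustrative assumptions, and the paper's actual pipeline and ECG fusion are not reproduced here.

```python
# Hedged sketch of eye-aspect-ratio (EAR) drowsiness detection with OpenCV and
# Dlib; threshold, frame count and model path are illustrative assumptions.
import cv2
import dlib
from scipy.spatial import distance as dist

def eye_aspect_ratio(eye):
    """EAR = (|p2-p6| + |p3-p5|) / (2 |p1-p4|); drops toward 0 as the eye closes."""
    a = dist.euclidean(eye[1], eye[5])
    b = dist.euclidean(eye[2], eye[4])
    c = dist.euclidean(eye[0], eye[3])
    return (a + b) / (2.0 * c)

EAR_THRESHOLD, CLOSED_FRAMES = 0.25, 20    # assumed values, tuned per setup
detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

cap, counter = cv2.VideoCapture(0), 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for face in detector(gray):
        shape = predictor(gray, face)
        pts = [(shape.part(i).x, shape.part(i).y) for i in range(68)]
        # landmarks 36-41 and 42-47 are the two eyes in Dlib's 68-point model
        ear = (eye_aspect_ratio(pts[36:42]) + eye_aspect_ratio(pts[42:48])) / 2.0
        counter = counter + 1 if ear < EAR_THRESHOLD else 0
        if counter >= CLOSED_FRAMES:       # eyes closed for ~20 consecutive frames
            print("drowsiness alert")
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
```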
With significant advancements in digital technologies, firms find themselves competing in an increasingly dynamic business environment. It is of paramount importance that organizations undertake proper governance mechanisms with respect to their business and IT strategies. Therefore, IT governance (ITG) has become an important factor for firm performance. In recent years, agility has evolved as a core concept for governance, especially in the area of software development. However, the impact of agility on ITG and firm performance has not been analyzed by the broad scientific community. This paper focuses on the question of how the concept of agility affects the ITG–firm performance relationship. The conceptual model for this question was tested by a quantitative research process with 400 executives responding to a standardized survey. Findings show that the adoption of agile principles, values, and best practices in the context of ITG leads to meaningful results for governance, business/IT alignment, and firm performance.
IT governance: current state of and future perspectives on the concept of agility in IT governance
(2020)
Digital transformation has changed corporate reality and, with it, corporate IT environments and IT governance (ITG). As such, the perspective of ITG has shifted from the design of a relatively stable, closed and controllable system of a self-sufficient enterprise to a relatively fluid, open, agile and transformational system of networked co-adaptive entities. In light of this paradigm shift in ITG, this thesis aims to conceptualize a framework that integrates the concept of agility into the traditional ITG framework and to test the effects of such an extended ITG framework on corporate performance.
To do so, the thesis uses literature research and a mixed method design by blending both qualitative and quantitative research methods. Given the poorly understood situation of the agile mechanisms within the ITG framework, the building process of this thesis’ research model requires an adaptive and flexible approach which involves four different research phases. The initial a priori research model based on a comprehensive review of the extant literature is critically examined and refined at the end of each research phase, which later forms the basis of a subsequent research phase. As a result, the final research model provides guidance on how the conceptualized framework leads to better business/IT alignment as well as how business/IT alignment can mediate the effectiveness of such an extended ITG framework on corporate performance.
The first research phase explores the current state of literature with a focus on the ITG-corporate performance association. This analysis identifies five perspectives with respect to the relationship between ITG and corporate performance. The main variables lead to the perspectives of business/IT alignment, IT leadership, IT capability and process performance, resource relatedness and culture. Furthermore, the analysis presents core aspects explored within the identified perspectives that could act as potential mediators or moderators in the relationship between ITG and corporate performance.
The second research phase investigates the agile aspect of an effective ITG framework in the dynamic contemporary world through a qualitative study. Gleaned from 46 semi-structured interviews with governance experts across various industries, the study identifies 25 agile ITG mechanisms and 22 traditional ITG mechanisms that corporations use to master digital transformation projects. Moreover, the research offers two key patterns indicating a call for ambidextrous ITG, with corporations alternating between stability and agility in their ITG mechanisms.
In research phase three, a scale development process is conducted in order to develop the agile items explored in research phase two. Through 56 qualitative interviews with professionals, the evaluation uncovers 46 agile governance mechanisms. Moreover, these dimensions are rated by 29 experts to identify the most effective ones. This leads to the identification of six structural elements, eight processes, and eight relational mechanisms.
Finally, in research phase four a quantitative research approach through a survey of 400 respondents is established to test and predict the formulated relationships by using the partial least squares structural equation modelling (PLS-SEM) method. The results provide evidence for a strong causal relationship among an expanded ITG concept, business/IT alignment, and corporate performance. These findings reveal that the agile ITG mechanisms within an effective ITG framework seem critical in today’s digital age.
This research is unique in exploring the combination of traditional and agile ITG mechanisms. It contributes to the theoretical base by integrating and extending the literature on ITG, business/IT alignment, ambidexterity and agility, all of which have long been recognized as critical for achieving organizational goals. In summary, this work presents an original analysis of an effective ITG framework for digital transformation by including the agile aspect within the ITG construct. It highlights that it is not enough to apply only traditional mechanisms to achieve effective business/IT alignment in today’s digital age; agile ITG mechanisms are also needed. Therefore, a novel ITG framework following an ambidextrous approach is provided, consisting of traditional ITG mechanisms as well as newly developed agile ITG practices. This thesis also demonstrates that agile ITG mechanisms can be measured independently of traditional ITG mechanisms within one causal model. This is an important theoretical outcome that allows the current state of ITG to be assessed in two distinct dimensions, offering various pathways for further research on the different antecedents and effects of traditional and agile ITG mechanisms. Furthermore, this thesis makes practical contributions by highlighting the need to develop a basic governance framework powered by traditional ITG mechanisms and simultaneously increase agility in ITG mechanisms. The results imply that corporations might be even more successful if they include both traditional and agile mechanisms in their ITG framework. In this way, the uncovered agile ITG practices may provide a template for CIOs to derive their own mechanisms in following an ambidextrous approach that is suitable for their corporation.
The data presented in this article characterize the thermomechanical and microhardness properties of a novel melamine-formaldehyde resin (MF) intended for use as a self-healing surface coating. The investigated MF resin is able to undergo reversible crosslinking via Diels-Alder reactive groups. The microhardness data were obtained from nanoindentation measurements performed on solid resin film samples at different stages of the self-healing cycle. Thermomechanical analysis was performed under dynamic load conditions. The data provide supplemental material to the manuscript published by Urdl et al. 2020 (https://doi.org/10.1016/j.eurpolymj.2020.109601) on the self-healing performance of this resin, where a more thorough discussion of the preparation, the properties of this coating material and its application in impregnated paper-based decorative laminates can be found.
The self-healing effect of melamine-based surfaces, triggered by temperature, was investigated. The temperature-triggered reversible healing chemistry on which the self-healing effect is based was the Diels-Alder (DA) reaction between furan and maleimide groups. Melamine-furan-containing building blocks were connected by a multi-functional maleimide crosslinker via a Diels-Alder (DA) reaction to give a DA adduct. The DA adduct was then reacted with formaldehyde to form a network by the conventional condensation reaction of melamine amino groups with formaldehyde. The obtained resin was characterised and used for the impregnation of paper. Impregnated papers and neat resin were used to perform scratch-healing tests and mechanical analysis of the novel coating system.
Thermoplastic polymers like ethylene-octene copolymer (EOC) may be grafted with silanes via reactive extrusion to enable subsequent crosslinking for advanced biomaterials manufacture. However, this reactive extrusion process is difficult to control and it is still challenging to reproducibly arrive at well-defined products. Moreover, high grafting degrees require a considerable excess of grafting reagent. A large proportion of the silane passes through the process without reacting and needs to be removed at great expense by subsequent purification. This results in unnecessarily high consumption of chemicals and a rather resource-inefficient process. It is thus desirable to be able to set target grafting degrees with optimum grafting efficiency by means of suitable process control. In this study, the continuous grafting of vinyltrimethoxysilane (VTMS) on ethylene-octene copolymer (EOC) via reactive extrusion was investigated. Successful grafting was verified and quantified by 1H-NMR spectroscopy. The effects of five process parameters and their synergistic interactions on grafting degree and grafting efficiency were determined using a face-centered experimental design (FCD). Response surface methodology (RSM) was applied to derive a causal process model and define process windows yielding arbitrary grafting degrees between <2 and >5% at a minimum waste of grafting agent. It was found that the reactive extrusion process was strongly influenced by several second-order interaction effects, making this process difficult to control. Grafting efficiencies between 75 and 80% can be realized as long as grafting degrees <2% are admitted.
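Response surface methodology fits a second-order polynomial model, including the two-factor interaction terms highlighted above, to designed-experiment data; a generic sketch with synthetic data follows. The two coded factors and the response values are invented stand-ins for the study's five process parameters, not the published measurements.

```python
# Generic RSM sketch: fit a full quadratic (second-order) model with two-factor
# interactions to designed-experiment data. Factors, levels and the response
# are synthetic stand-ins, not the study's actual data.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

# Coded factor settings of a small face-centered design for two hypothetical
# factors (e.g. screw speed and silane dosage), levels -1 / 0 / +1.
X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],      # factorial points
              [-1, 0], [1, 0], [0, -1], [0, 1],        # face-centered axial points
              [0, 0], [0, 0], [0, 0]])                 # center replicates
rng = np.random.default_rng(2)
grafting = (3.5 + 0.9 * X[:, 0] + 0.5 * X[:, 1]
            + 0.6 * X[:, 0] * X[:, 1] - 0.4 * X[:, 0] ** 2
            + rng.normal(0, 0.05, len(X)))             # synthetic response [%]

quad = PolynomialFeatures(degree=2, include_bias=False)
model = LinearRegression().fit(quad.fit_transform(X), grafting)
for name, coef in zip(quad.get_feature_names_out(["x1", "x2"]), model.coef_):
    print(f"{name:>7}: {coef:+.2f}")   # the "x1 x2" term captures the interaction
```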
Due to decreased mobility or families living apart, older adults are especially vulnerable to social isolation. The literature suggests that technology can help to prevent this isolation. The present work addresses an approach to participating in society by sharing cherished knowledge. We propose PrecRec, a cooking recipe exchange application for older adults designed to make them feel precious and valued. PrecRec has been developed and evaluated in an iterative process with eleven older adults. The results show that a broad perspective has to be taken into account when designing such systems.
It is essential for the success of a company to set a strategic direction in which a product offering will be developed over time to achieve the company vision. For this reason, roadmaps are used in practice. In general, roadmaps can be expressed in various forms such as technology roadmaps, product roadmaps or industry roadmaps. From the point of view of industry, the basic purpose of a roadmap is to explore, visualize and communicate the dynamic linkage between markets, products and technology.
Nowadays companies are facing increasing market dynamics, rapidly evolving technologies and shifting user expectations. Together with the adoption of lean and agile practices, this situation makes it increasingly difficult to plan and predict upfront which products, services or features should be developed in the future. Consequently, many organizations are struggling with their ability to provide reliable and stable product roadmaps using traditional approaches. This paper aims at identifying and better understanding the measures companies have taken to adapt their current product roadmapping practices to the requirements of a dynamic and uncertain market environment. This also includes challenges and success factors within this transformation process as well as measures that companies have planned for the future. We conducted 18 semi-structured expert interviews with practitioners of different companies and performed a thematic data analysis. The study shows that the participating companies are aware that the transformation of traditional product roadmapping practices to fulfill the requirements of a dynamic and uncertain market environment is necessary. The most important measures that the participating companies have taken are 1) adequate item planning concerning the timeline, 2) the replacement of a fixed time-based chart by a more flexible structure, 3) the use of outcomes to determine the items (such as features) on the roadmap, and 4) the creation of a central roadmap which allows deriving different representations for each stakeholder and department.
Context: A product roadmap is an important tool in product development. It sets the strategic direction in which the product is to be developed to achieve the company’s vision. However, for product roadmaps to be successful, it is essential that all stakeholders agree with the company’s vision and objectives and are aligned and committed to a common product plan.
Objective: In order to gain a better understanding of product roadmap alignment, this paper aims at identifying measures, activities and techniques in order to align the different stakeholders around the product roadmap.
Method: We conducted a grey literature review according to the guidelines of Garousi et al.
Results: Several approaches to gain alignment were identified such as defining and communicating clear objectives based on the product vision, conducting cross-functional workshops, shuttle diplomacy, and mission briefing. In addition, our review identified the “Behavioural Change Stairway Model” that suggests five steps to gain alignment by building empathy and a trustful relationship.
The demands on energy providers will grow: in the future, tasks such as the development of digitalized products/services as well as ecological activities will gain in relevance. This is shown by Hochschule Reutlingen in its current survey of supervisory board members, managing directors and executives. Despite the expected changes, the supervisory boards are aware of the pressure to professionalize but currently appear only moderately equipped for the future challenges facing their companies. Particularly relevant: professionalizing board work in municipal energy utilities enables a higher level of perceived company success. These are the findings of the study conducted by the Reutlinger Energiezentrum at Hochschule Reutlingen on behalf of five companies in the sector.
The demands on energy providers will grow. In the future, tasks such as the development of digitalized products and services as well as ecological activities will gain in importance. This is shown by the present study among supervisory board members, managing directors and executives in the energy industry. Despite the expected changes, the supervisory boards are aware of the pressure to become more professional but currently appear only moderately equipped for the future challenges facing their companies. Particularly relevant: professionalizing board work in municipal energy utilities enables a higher level of perceived company success. From a systems perspective, this also requires an effective system of objectives for the company and highly effective leadership by the management, for example clear communication of the corporate strategy. Furthermore, possible differences in how the management's entrepreneurial leadership style is perceived must be identified and resolved, since this style can be an important catalyst for entrepreneurial thinking and action among executives and employees.
This article adopts a qualitative comparative causal mapping approach to extend knowledge of the interrelated barriers to public entrepreneurship and the outcomes of such entrepreneurship. The results highlight marked differences between the sales segment and the distribution grid segment of German public enterprises that should prompt a refined perspective on public entrepreneurship. Notably, besides intra-organizational barriers and those interfering from the external environment, results also show that a public enterprise’s supervisory board can hinder its progress. This study thus contributes to recent discussion on governance and entrepreneurship by revealing a feature that could distinguish public from private enterprises.
The success of resale in the fashion industry is illustrated above all by its strong growth: compared to retail, it grew 24 times faster last year. Resale, a currently emerging form of selling, refers to the process products undergo when they are sold a second time, that is, resold second-hand. Retail, by contrast, describes the traditional sale of products through (brick-and-mortar) retail channels. More and more products that have already been owned by someone are thus returning to the market and are available for sale again. How this upward trend in the fashion industry can be measured and to what extent resale encounters retail is described below.
Impregnated paper-based decorative laminates prepared from lignin-substituted phenolic resins
(2020)
High Pressure Laminate (HPL) panels consist of stacks of self-gluing paper sheets soaked with phenol-formaldehyde (PF) resins. An important requirement for such PFs is that they must rapidly penetrate and saturate the paper pores. Partially substituting phenol with bio-based phenolic chemicals like lignin changes the physico-chemical properties of the resin and affects its ability to penetrate the paper. In this study, PF formulations containing different proportions of lignosulfonate and kraft lignin were used to prepare paper-based laminates. The penetration of a kraft paper sheet was characterized by a recently introduced device measuring the conductivity between both sides of the paper sheet after a drop of resin was placed on the surface and allowed to penetrate the sheet. The main target value measured was the time required for a specific resin to completely penetrate the defined paper sample (“penetration time”). This penetration time generally depends on the molecular weight distribution, the flow behavior and the polarity of the resin, which in turn depend on the manufacturing conditions of the resin. In the present study, the influences of three process factors, (1) type of lignin material used for substitution, (2) lignin modification by phenolation and (3) degree of phenol substitution, on the penetration times of various lignin-phenolic hybrid impregnation resins were studied using a complete two-level, three-factor experimental design. Thin laminates made with the resins diluted in methanol were mechanically tested in terms of tensile and flexural strains, and their cross-sections were studied by light microscopy.
Here, the effects of substituting portions of fossil-based phenol in phenol formaldehyde resin by renewable lignin from two different sources are investigated using a factorial screening experimental design. Among the resins consumed by the wood-based industry, phenolics are one of the most important types used for impregnation, coating or gluing purposes. They are prepared by condensing phenol with formaldehyde (PF). One major use of PF is as matrix polymer for decorative laminates in exterior cladding and wet-room applications. Important requirements for such PFs are favorable flow properties (low viscosity), rapid curing behavior (high reactivity) and sufficient self-adhesion capacity (high residual curing potential). Partially substituting phenol in PF with bio-based phenolic co-reagents like lignin modifies the physicochemical properties of the resulting resin. In this study, phenol-formaldehyde formulations were synthesized in which either 30% or 50% (by weight) of the phenol monomer was substituted by either sodium lignosulfonate or kraft lignin. The effect of modifying the lignin material by phenolation before incorporation into the resin synthesis was also investigated. The resins so obtained were characterized by Fourier Transform Infra-Red (FTIR) spectroscopy, Size Exclusion Chromatography (SEC), Differential Scanning Calorimetry (DSC), rheology, and measurements of contact angle and surface tension using the Wilhelmy plate method and drop shape analysis.
In digitally transformed working environments, employees organize their working time, their place of work, and the way they complete tasks to a greater extent themselves. Companies that want to increase the degree of self-organization in the course of the transformation process face a complex challenge. Self-organization affects numerous elements of the organization, such as work tasks and roles, leadership, rules, and competencies. Based on an empirically developed frame of reference, the Digitalisierungsatlas (digitalization atlas), the various elements can be considered in an integrated way and the interactions between the dimensions can be examined. If self-organization is viewed in terms of employees' autonomy to shape their work tasks and their own role in the organization, the interactions between the organizational dimensions and leadership are particularly relevant. The tensions between these dimensions are examined more closely. Overall, the article shows that self-organization cannot be understood as an independent phenomenon but always interacts with other dimensions.
In modern working environments, workplace-related digital technologies are increasingly used. Although this offers numerous opportunities, it can also have negative consequences for employees' health. For many companies, these challenges are further exacerbated by the current coronavirus crisis. Stress caused directly or indirectly by the use of technologies is referred to as "technostress". Important levers for avoiding it include the design of technologies and the consideration of various individual and situational factors in technological change processes.
Facial expressions play a dominant role in facilitating social interactions. We endeavor to develop tactile displays to reinstate facial expression modulated communication. The high spatial and temporal dimensionality of facial movements poses a unique challenge when designing tactile encodings of them. A further challenge is developing encodings that are attuned to the perceptual characteristics of our skin. A caveat of using vibrotactile displays is that tactile stimuli have been shown to induce perceptual tactile aftereffects when used on the fingers, arm and face. However, at present, despite the prevalence of waist-worn tactile displays, no such investigations of tactile aftereffects at the waist region exist in the literature, though they are warranted by the unique sensory and perceptual signalling characteristics of this area. Using an adaptation paradigm we investigated the presence of perceptual tactile aftereffects induced by continuous and burst vibrotactile stimuli delivered at the navel, side and spinal regions of the waist. We report evidence that the tactile perception topology of the waist is non-uniform, and specifically that the navel and spine regions are resistant to adaptive aftereffects while side regions are more prone to perceptual adaptations to continuous but not burst stimulations. Results of our current investigations highlight the unique set of challenges posed by designing waist-worn tactile displays. These and future perceptual studies can directly inform more realistic and effective implementations of complex high-dimensional spatiotemporal social cues.
Within the research project, finishing agents and processes were developed that serve the preventive protection of textiles (in particular floor coverings) against soiling. The process provides for a combined finishing of textiles with fluorinated polymers containing incorporated nanoparticles (primarily SiO2) to increase surface roughness. Commercially available hydrophobizing agents (fluorocarbon- or hydrocarbon-based polymers) in combination with SiO2 nanoparticles were applied to carpets and examined with respect to soiling, e.g. by coffee, KoolAid, red wine, AATCC standard soil, and black shoe polish. For this purpose, the shear-sensitive dispersions of the hydrophobizing agents were combined with newly developed, adapted dispersions of SiO2 nanoparticles. The SiO2 nanoparticles were synthesized with systematically varied sizes of 10-1,000 nm, comprehensively characterized, and stabilized using newly developed fluoromethacrylate copolymers with reactive groups (maleic, itaconic or citraconic anhydride) and hydrophilic modifiers (alcohol or amine groups). The resulting polymer-particle dispersions could be applied to textiles (PA, PES or WO carpets and fabrics) from aqueous or ethanolic-aqueous solutions. Furthermore, the newly developed fluorocarbon polymers were also tested with regard to their application. In soiling tests, the carpets finished in this way showed less soiling by standard soil than reference materials. The durability of the finish under mechanical stress could be improved by crosslinking the polymers on the textile material. For PA 6 and PA 6.6 carpets, the best results with regard to reduced soiling by water-soluble soils (coffee, red wine, KoolAid) compared to untreated carpets were obtained when the finish was applied with fluoropolymer-stabilized SiO2 nanoparticles or with a combined dispersion of SiO2 particles and fluorocarbon resins. Compared to untreated carpets, reduced soiling by AATCC standard soil (DIN EN ISO 11378-2) was found for PA 6 carpets treated with SiO2 particles. Hydrophobic soils (e.g. black shoe polish) were best removed from carpets finished with fluorocarbon polymers. The combination of SiO2 particles with fluorocarbon polymers usually proved more favorable than treatment with fluorocarbon resins alone. A correlation between nanoparticle size, abrasion resistance and cleaning properties was found, and it could be shown that fluorocarbon-nanoparticle composites improve them. The mechanical durability of the anti-soiling finish with SiO2 nanoparticles and fluorocarbon polymers on polyamide carpets was tested, e.g., by hexapod tumbler testing (according to ISO 10361). SEM, IR spectroscopy and the water droplet test confirmed an intact coating after 4,000 and even after 12,000 revolutions. Crosslinkers that link the polymer to itself, the polymer to the particles and/or to the substrate surface could in part improve abrasion resistance (more suitable crosslinkers may need to be identified here).
Are textile structures better? In the professional world, there is no doubt that textile composites can offer many advantages. It is well known that they are often better than non-textile alternatives. There are manifold examples. Innovative developments include not only the widely noted textile-reinforced concrete, which was awarded the Deutscher Zukunftspreis (German Future Award), but also a huge number of perhaps less noticed or less spectacular products based on fiber-reinforced plastics.
Regardless of company size or industry sector, a majority of project teams and companies use customized processes that combine different development methods, so-called hybrid development methods. Even though such hybrid development methods are highly individualized, a common understanding of how to systematically construct synergetic practices is missing. Based on 1,467 data points from a large-scale online survey among practitioners, we study the current state of practice in process use to answer the question: What are hybrid development methods made of? Our findings reveal that only eight methods and a few practices build the core of modern software development. This small set allows for statistically constructing hybrid development methods.
Hardly any software development process is used as prescribed by authors or standards. Regardless of company size or industry sector, a majority of project teams and companies use hybrid development methods (short: hybrid methods) that combine different development methods and practices. Even though such hybrid methods are highly individualized, a common understanding of how to systematically construct synergetic practices is missing. In this article, we make a first step towards a statistical construction procedure for hybrid methods. Grounded in 1467 data points from a large-scale practitioner survey, we study the question: What are hybrid methods made of and how can they be systematically constructed? Our findings show that only eight methods and a few practices build the core of modern software development. Using an 85% agreement level in the participants' selections, we provide examples illustrating how hybrid methods can be characterized by the practices they are made of. Furthermore, using this characterization, we develop an initial construction procedure, which allows for defining a method frame and enriching it incrementally to devise a hybrid method using ranked sets of practices.
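The agreement-level idea behind such a construction procedure can be sketched generically: practices selected by at least a given share of practitioners (the article uses an 85% agreement level) form the method frame, and the remaining practices are ranked for incremental enrichment. The survey responses below are invented placeholders, not the article's data.

```python
# Sketch of the agreement-level idea behind the construction procedure: practices
# named by at least 85% of respondents form the method frame; the rest are ranked
# for incremental enrichment. Survey data here are invented placeholders.
from collections import Counter

responses = [
    {"Code Review", "Daily Standup", "CI", "Unit Testing"},
    {"Code Review", "Daily Standup", "CI", "Pair Programming"},
    {"Code Review", "Daily Standup", "CI", "Unit Testing"},
    {"Code Review", "Daily Standup", "Unit Testing"},
]

counts = Counter(p for r in responses for p in r)
n = len(responses)
frame = [p for p, c in counts.items() if c / n >= 0.85]           # core method frame
enrichment = sorted((p for p in counts if p not in frame),
                    key=lambda p: counts[p] / n, reverse=True)    # ranked extras

print("method frame:", sorted(frame))
print("ranked enrichment candidates:", enrichment)
```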
A methodology for designing planar spiral antennas with a feeding network embedded within a dielectric is presented. To avoid a purely academic design that could not be manufactured with available standard technologies, the approach takes manufacturing process requirements into account through the choice of materials used in the simulation. General design rules are provided. They encompass, amongst others, selection criteria for the dielectric material, aspects to consider when sketching the radiating element design, as well as those for the implementation of the feeding network. A rule of thumb, which may be helpful in determining the height of the antenna's supporting substrate, has been found. The appeal of the method resides in the fact that it eases the design process and helps to minimize errors, saving time and money. The approach also enables the design of a compact, small-size spiral antenna as antenna-in-package (AiP), and provides the opportunity to assemble the antenna with other RF components/systems on the same layer stack or on the same integration platform.
Purpose: Despite growing interest in the intersection of supply chain management (SCM) and management accounting (MA) in the academic debate, there is a lack of understanding regarding both the content and the delimitation of this topic. As of today, no common conceptualization of supply chain management accounting (SCMA) exists. The purpose of this study is to provide an overview of the research foci of SCMA in the scholarly debate of the past two decades. Additionally, it analyzes whether and to what extent the academic discourse on MA in SCs has already found its way into SCM and MA higher education.
Design/methodology/approach: A content analysis is conducted including 114 higher education textbooks written in English or German.
Findings: The study finds that SC-specific concepts of MA are seldom covered in current textbooks of both disciplines. The authors conclude that although there is an extensive body of scholarly research about SCMA concepts, there is a significant discrepancy with what is taught in higher education textbooks.
Practical implications: There is a large discrepancy between the extensive knowledge available in scholarly research and what we teach in both disciplines. This implies that graduates of both disciplines lack important knowledge and skills in controlling and accounting for SCs. To bring about the necessary change, MA and SCM in higher education must be more integrative.
Originality/value: To the best of the authors' knowledge, this study is the first of its kind comprising a large textbook sample in both English and German. It is the first substantiated assessment of the current state of integration between SCM and MA in higher education.
Companies compete more and more as integrated supply chains rather than as individual firms. The success of the entire supply chain determines the economic well-being of the individual company. With management attention shifting to supply chains, the role of management accounting naturally must extend to the cross-company layer as well. This book demonstrates how management accounting can make a significant contribution to supply chain success. It targets students who are already familiar with the fundamentals of accounting and now want to extend their expertise in the field of cross-company (or network) management accounting. Practitioners will draw valuable insights from the text as well.