Rare but extreme events, such as pandemics, terror attacks, and stock market collapses, pose a risk that could undermine cooperation in societies and groups. We extend the public goods game (PGG) to investigate the relationship between rare but extreme external risks and cooperation in a laboratory experiment. By incorporating risk as an external random variable in the PGG, independent of the participants’ contributions, we preserve the economic equilibrium of non-cooperation in the original game. Furthermore, we examine whether cooperation can be restored by the relatively simple intervention of informing about countermeasures while keeping the actual risk constant. Our experimental results reveal that on average extreme risks indeed decrease contributions by about 20%; however, countermeasure information increases contributions by about 10%. Specifically, in the first interactions, cooperation levels can even reach those observed in the riskless baseline. Our results suggest that countermeasure information could help reinforce social cohesion and resilience in the face of rare but extreme risks.
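The extended game described above can be sketched in a few lines. The payoff function below is the standard linear public goods game with an external shock layered on top; the endowment, multiplier, and loss parameters are illustrative assumptions, not the values used in the experiment.

```python
import random

def pgg_payoff(endowment, contribution, group_contributions, multiplier, n):
    """Standard linear public goods game payoff for one player."""
    return endowment - contribution + multiplier * sum(group_contributions) / n

def payoff_with_external_risk(endowment, contribution, group_contributions,
                              multiplier, n, loss_prob, loss_fraction, rng):
    """Payoff when a rare external shock destroys part of the round earnings.
    The shock is independent of contributions, so the ordering of payoffs
    across strategies (and hence the non-cooperative equilibrium) is preserved."""
    payoff = pgg_payoff(endowment, contribution, group_contributions, multiplier, n)
    if rng.random() < loss_prob:  # rare but extreme event
        payoff *= 1.0 - loss_fraction
    return payoff
```

Because the multiplicative shock applies to every player's earnings regardless of what they contributed, defecting still dominates contributing in expectation, which is the design point the abstract makes.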
Wave-like differential equations occur in many engineering applications. Here the engineering setup is embedded into the framework of functional analysis of modern mathematical physics. After an overview, the Hilbert space approach to free Euler–Bernoulli bending vibrations of a beam in one spatial dimension is investigated. We analyze in detail the corresponding positive, self-adjoint fourth-order differential operators associated with the boundary conditions in statics. A comparison with the free wave oscillations of a string is outlined.
Automatic content creation system for augmented reality maintenance applications for legacy machines
(2024)
Augmented reality (AR) applications have great potential to assist maintenance workers in their operations. However, creating AR solutions is time-consuming and laborious, which limits their widespread adoption in the industry. It therefore often happens that even with the latest generation of machines, instead of an AR solution, the user only receives an electronic manual for equipment operation and maintenance. This is commonplace with legacy machines. For this reason, solutions are required that simplify the creation of such AR solutions. This paper presents an approach that uses an electronic manual as a basis to create fast and cost-effective AR solutions for maintenance. As part of the approach, an application was developed to automatically identify and subdivide the chapters of electronic manuals via the bookmarks in the table of contents. The contents are then automatically uploaded to a central server and indexed with a suitable marker to make the data retrievable. The prepared content can then be accessed via the marker for creating context-related AR instructions. The application is characterized by the fact that no developers or experts are required to prepare the information. In addition to complying with common design criteria, the clear presentation of the contents and the intuitive use of the system offer added value for the performance of maintenance tasks. Together, these two elements form a novel way to retrofit legacy machines with AR maintenance instructions. The practical validation of the system took place in a factory environment. For this purpose, the content was created for a filter change on a CNC milling machine. The results show that inexperienced users can extract appropriate content with the software application. Furthermore, it is shown that maintenance workers can access the content with an AR application developed for the Microsoft HoloLens 2 and complete simple tasks provided in the manufacturer's electronic manual.
Plasmonics and nanophotonics both deal with the interaction of light with structures of typically sub-wavelength size in one or more dimensions. Over the past decade or two, interest in these topics has grown significantly. This includes basic research towards detailed understanding of light-matter interaction and the manipulation of light on the nanometer scale as well as the search for applications ranging from quantum information processing, data storage, solar cells, spectroscopy and microscopy to (bio-)sensors and biomedical devices. Key enablers for this development are advanced materials and the variety of techniques to structure them with nanometer precision on the one hand, and progress in the theoretical description and numerical implementations on the other. Besides the traditional metals Au, Ag, Al, and Cu, also compounds such as refractory metal nitrides with much higher durability as well as semiconductors, dielectrics and hybrid structures have become of interest. Structuring techniques are not only aiming at the fabrication of individual elements with highest precision for detailed interaction analysis, but also at methods for large-scale, low-cost nanofabrication, mostly for sensor applications. In the former case, mostly electron beam lithography and focused ion beam milling are employed, while for high throughput various forms of nanoimprint and self-assembly based techniques are favored. Thin film deposition and pattern transfer techniques are mostly derived from those developed for nano-electronics; however, more recently methods such as electroless plating, atomic layer deposition or etching and 3-D additive techniques are appearing. Thus, highly specialized expertise has been acquired in the different disciplines, and successful research and technology transfer will draw from this pool of knowledge.
Different network architectures are being used to build remote laboratories. Historically, it has been difficult to integrate industrial control systems with higher-level IT systems like enterprise resource planning (ERP), manufacturing execution systems (MES), and manufacturing operations management (MOM). Getting these systems to communicate with one another has proven to be relatively difficult due to the absence of shared protocols between them. The Open Platform Communications Unified Architecture (OPC-UA) protocol was introduced as a remedy for this issue and is gaining popularity, but what if open-source protocols that are widely used in the IT industry could be used instead? This paper presents the development of an IT architecture for a cyber-physical industrial control systems laboratory that enables a seamless interconnection and integration of its elements. The architecture utilises Node-RED, an open-source programming platform developed by IBM that is focused on making it simple to link physical components, APIs, and web services. This cyber-physical laboratory is for learning the principles of an industrial cascaded process control factory. Finally, this paper also discusses future work relating to digital twins (DT). A coupled tank system is selected as a teaching factory to illustrate a range of fluid control applications in a typical chemical process factory.
In the context of Industry 4.0, intralogistics faces an increasingly complex and dynamic environment driven by a high level of product customisation and complex manufacturing processes. One approach to deal with these changing conditions is the decentralised and intelligent connectivity of intralogistics systems. However, wireless connectivity presents a major challenge in the industry due to strict requirements such as safety and real-time data transmission. In this context, the fifth generation of mobile communications (5G) is a promising technology to meet the requirements of safety-critical applications, particularly since 5G offers the possibility of establishing private 5G networks, also referred to as standalone non-public networks. Through their isolation from public networks, private 5G networks provide exclusive coverage for private organisations, offering them high intrinsic network control and data security. However, 5G is still under development and is being gradually introduced in a continuous release process. This process lacks transparency regarding the performance of 5G in individual releases, complicating the successful adoption of 5G as an industrial communication technology. Additionally, the evaluation of 5G against the specified target performance is insufficient due to the impact of the environment and external interfering factors on 5G in the industrial environment. Therefore, this paper aims to develop a technical decision-support framework that takes a holistic approach to evaluate the practicality of 5G for intralogistics use cases by considering two fundamental stages. The first stage analyses technical parameters and characteristics of the use case to evaluate the theoretical feasibility of 5G. The second stage investigates the application's environment, which substantially impacts the practicality of 5G, for instance, the influence of surrounding materials.
Finally, a case study validates the proposed framework by means of an autonomous mobile robot. As a result, the validation proves the proposed framework's applicability and shows the practicality of the autonomous mobile robot when integrated into a private 5G network testbed.
Cyber-Physical Production Systems increasingly use semantic information to meet growing flexibility requirements. Ontologies are often used to represent and use this semantic information. Existing systems focus on mapping knowledge and less on the exchange with other relevant IT systems (e.g., ERP systems) in which crucial semantic information, often implicit, is contained. This article presents an approach that enables the exchange of semantic information via adapters. The approach is demonstrated by a use case utilizing an MES and an ERP system.
The present study investigated the possibilities and limitations of using a low-cost NIR spectrometer for the verification of the presence of the declared active pharmaceutical ingredients (APIs) in tablet formulations, especially for medicine screening studies in low-resource settings. Spectra from 950 to 1650 nm were recorded for 170 pharmaceutical products representing 41 different APIs, API combinations or placebos. Most of the products, including 20 falsified medicines, had been collected in medicine quality studies in African countries. After exploratory principal component analysis, models were built using data-driven soft independent modelling of class analogy (DD-SIMCA), a one-class classifier algorithm, for tablet products of penicillin V, sulfamethoxazole/trimethoprim, ciprofloxacin, furosemide, metronidazole, metformin, hydrochlorothiazide, and doxycycline. Spectra of amoxicillin and amoxicillin/clavulanic acid tablets were combined into a single model. Models were tested using Procrustes cross-validation and by projection of spectra of tablets containing the same or different APIs. Tablets containing no or different APIs could be identified with 100 % specificity in all models. A separation of the spectra of amoxicillin and amoxicillin/clavulanic acid tablets was achieved by partial least squares discriminant analysis. 15 out of 19 external validation products (79 %) representing different brands of the same APIs were correctly identified as members of the target class; three of the four rejected samples showed an API mass percentage of the total tablet weight that was out of the range covered in the respective calibration set. Therefore, in future investigations larger and more representative spectral libraries are required for model building. Falsified medicines containing no API, incorrect APIs, or grossly incorrect amounts of the declared APIs could be readily identified. 
Variation between different NIR-S-G1 spectroscopic devices led to a loss of accuracy if spectra recorded with different devices were pooled. Therefore, piecewise direct standardization was applied for calibration transfer. The investigated method is a promising tool for medicine screening studies in low-resource settings.
The Circular Economy aims to reintroduce the value of products into the economic cycle at the same value chain level. While the activities of the Circular Economy are already well-defined, there exists a gap in how returned products are treated by the industry. This study aims to examine how a process should be designed to handle returned products in the context of the Circular Economy. To achieve this, a machine learning-based algorithm is used to classify data and extract relevant information throughout the product life cycle. The focus of this research is limited to land transportation systems within the Sharing Economy sector.
Context
In a world of high dynamics and uncertainties, it is almost impossible to have a long-term prediction of which products, services, or features will satisfy the needs of the customer. To counter this situation, conducting Continuous Improvement or Design Thinking for product discovery is a common approach. A major constraint in conducting product discovery activities is the high effort to discover and validate features and requirements. In addition, companies struggle to integrate product discovery activities into their agile processes and iterations.
Objective
This paper suggests a supportive tool, the “Discovery Effort Worthiness (DEW) Index”, for product owners and agile teams to determine a suitable amount of effort that should be spent on Design Thinking activities. To operationalize DEW, proposals for practitioners are presented that can be used to integrate product discovery into product development and delivery.
Method
A case study was conducted for the development of the DEW index. In addition, we conducted an expert workshop to develop proposals for the integration of product discovery activities into the product development and delivery process.
Results
First, we present the "Discovery Effort Worthiness Index" in the form of a formula. Second, we identified requirements that must be fulfilled for the systematic integration of product discovery activities into product development and delivery. Third, from these requirements we derived proposals for integrating product discovery activities into a company's product development and delivery.
Conclusion
The developed "Discovery Effort Worthiness Index" provides a tool for companies and their product owners to determine how much effort they should spend on Design Thinking methods to discover and validate requirements. Integrating product discovery with product development and delivery should ensure that the results of product discovery are incorporated into product development. This aims to systematically analyze product risks to increase the chance of product success.
The Industry 4.0 paradigm requires concepts for integrating intelligent/smart IoT solutions into manufacturing. Such intelligent solutions are envisioned to increase flexibility and adaptability in smart factories. Especially autonomous cobots capable of adapting to changing conditions are a key enabler for changeable factory concepts. However, identifying the requirements and solution scenarios incorporating intelligent products challenges the manufacturing industry, especially in the SME sector. In pick-and-place scenarios, changing coordinate systems of workpiece carriers cause errors in the placing process. Using the IPIDS framework, this paper describes the development of a tool-center-point positioning method to improve the process stability of a collaborative robot in a changeable assembly workstation. Applying the framework identifies the requirement for an intelligent workpiece carrier as a part of the solution. Implementing and evaluating the solution within a changeable factory validates the IPIDS framework.
Due to constantly changing conditions, demand, and technologies, companies increasingly seek flexibility. Productivity results from automation, improved working conditions, and a focus on people in production interacting with machines. Unfortunately, the human factor is often not considered when new concepts aim to increase flexibility and productivity. This work aims to develop a hybrid assistance system that allows a dynamic configuration of cyber-physical production systems, considering the current order situation and available resources and utilizing simulation. The system also considers human factors in addition to economic factors, which contributes to an extended economic appraisal.
Film formation of the self-synthesized polymer EPM–g–VTMDS (ethylene–propylene rubber, EPM, grafted with vinyltetramethyldisiloxane, VTMDS) was studied regarding bonding to the adhesion promoter vinyltrimethoxysilane (VTMS) on oxidized 18/10 chromium/nickel steel (V2A) stainless steel surfaces. Polymer films of different mixed solutions, including a commercial siloxane and silicone, dimethyl, vinyl group terminated crosslinker (HANSA SFA 42100, CAS# 68083-19-2, 0.35 mmol vinyl/g) and the platinum, 1,3-diethenyl-1,1,3,3-tetramethyldisiloxane complex known as Karstedt's catalyst (ALPA–KAT 1, CAS# 68478-92-2), were spin-coated on V2A stainless steel surfaces with adsorbed VTMS thin layers in order to analyze the film formation of EPM–g–VTMDS at early stages. Surface topography and chemical bonding of the high-performance polymers on different oxidized V2A surfaces were investigated with X-ray photoelectron spectroscopy (XPS), atomic force microscopy (AFM), scanning electron microscopy (SEM) and surface-enhanced Raman spectroscopy (SERS). AFM and SEM as well as XPS results indicated that the formation of the polymer film proceeds via growth of polymer islands. Chemical signatures of the essential polymer contributions, linker and polymer backbones, could be identified using XPS core level peak shape analysis and also SERS. The appearance of signals related to Si–O–Si can be seen as a clear indication of lateral crosslinking and silica network formation in the films on the V2A surface.
Mesoporous silica microspheres (MPSMs) find broad application as separation materials in high-performance liquid chromatography (HPLC). A promising preparation strategy uses p(GMA-co-EDMA) as hard templates to control the pore properties and achieve a narrow size distribution of the MPSMs. Here six hard templates were prepared which differ in their porosity and surface functionalization. This was achieved by altering the ratio of GMA to EDMA and by adjusting the proportion of monomer and porogen in the polymerization process. The various amounts of GMA incorporated into the polymer network of P1-6 lead to different numbers of tetraethylenepentamine (TEPA) groups in the p(GMA-co-EDMA) template. This was established by a partial least squares regression (PLS-R) model based on FTIR spectra of the templates. Deposition of silica nanoparticles (SNPs) into the template under Stoeber conditions and subsequent removal of the polymer by calcination result in MPSM1-6. The size of the SNPs and their incorporation depend on the pore parameters of the template and the degree of TEPA functionalization. Moreover, the incorporated SNPs construct the silica network and control the pore parameters of the MPSMs. Functionalization of the MPSMs with trimethoxy(octadecyl)silane allows their use as a stationary phase for the separation of biomolecules. The pore characteristics and the functionalization of the template determine the pore structure of the silica particles and, consequently, their separation properties.
The fifth generation of mobile communication (5G) is a wireless technology developed to provide reliable, fast data transmission for industrial applications, such as autonomous mobile robots, and to connect cyber-physical systems using Internet of Things (IoT) sensors. In this context, private 5G networks enable the full performance of industrial applications built on dedicated 5G infrastructures. However, emerging wireless communication technologies such as 5G are a complex and challenging topic for training in learning factories, which often lacks physical or visual interaction. Therefore, this paper aims to develop a real-time performance monitoring system for private 5G networks and different industrial 5G devices to visualise the performance and the factors influencing 5G for students and future connectivity experts. Additionally, this paper presents the first long-term measurements of private 5G networks and shows the performance gap between the actual and targeted performance of private 5G networks.
Since its first publication in 2015, the learning factory morphology has been frequently used to design new learning factories and to classify existing ones. The structuring supports the concretization of ideas and promotes exchange between stakeholders.
However, since the implementation of the first learning factories, the learning factory concept has constantly evolved.
Therefore, in the Working Group "Learning Factory Design" of the International Association of Learning Factories, the existing morphology has been revised and extended based on an analysis of the trends observed in the evolution of learning factory concepts. On the one hand, new design elements were added to the previous seven design dimensions; on the other hand, new design dimensions were introduced. The revised version of the morphology thus provides even more targeted support for the design of new learning factories in the future.
High-performance liquid chromatography is one of the most important analytical tools for the identification and separation of substances. The efficiency of this method is largely determined by the stationary phase of the columns. Although monodisperse mesoporous silica microspheres (MPSM) represent a commonly used material as stationary phase their tailored preparation remains challenging. Here we report on the synthesis of four MPSMs via the hard template method. Silica nanoparticles (SNPs) which form the silica network of the final MPSMs were generated in situ from tetraethyl orthosilicate (TEOS) in the presence of (3-aminopropyl) triethoxysilane (APTES) functionalized p(GMA-co-EDMA) as hard template. Methanol, ethanol, 2-propanol, and 1-butanol were applied as solvents to control the size of the SNPs in the hybrid beads (HB). After calcination, MPSMs with different sizes, morphology and pore properties were obtained and characterized by scanning electron microscopy, nitrogen adsorption and desorption measurements, thermogravimetric analysis, solid state NMR and DRIFT IR spectroscopy. Interestingly, the 29Si NMR spectra of the HBs show T and Q group species which suggests that there is no covalent linkage between the SNPs and the template. The MPSMs were functionalized with trimethoxy (octadecyl) silane and used as stationary phases in reversed-phase chromatography to separate a mixture of eleven different amino acids. The separation characteristics of the MPSMs strongly depend on their morphology and pore properties which are controlled by the solvent during the preparation of the MPSMs. Overall, the separation behavior of the best phases is comparable with those of commercially available columns. The phases even achieve faster separation of the amino acids without loss of quality.
The increase in product variance and shorter product lifecycles result in higher production ramp-up frequencies and promote the usage of mixed-model lines. The ramp-up is considered a critical step in the product life cycle, and in the automotive industry phases of the ramp-up are often executed on separate production lines (pilot lines) or in separate factories (pilot plants) to verify processes and to qualify employees without affecting the production of other products in the mixed-model line. The financial funds required for planning and maintaining dedicated pilot lines prevent small and medium-sized enterprises (SMEs) from applying them. Hence, SMEs require different tools for piloting and training during the production ramp-up. Learning islands, on which employees can be trained through induced and autonomous learning, offer a solution. In this work, a concept for the development and application of learning islands, containing the required organization, activities, and materials, is developed through expert interviews. The results of a case study application with a medium-sized automotive manufacturer show that learning islands are a viable tool for employee qualification and process verification during the ramp-up of mixed-model lines.
Monitoring heart rate and breathing is essential in understanding the physiological processes for sleep analysis. Polysomnography (PSG) systems have traditionally been used for sleep monitoring, but alternative methods can help to make sleep monitoring more portable in someone's home. This study conducted a series of experiments to investigate the use of pressure sensors placed under the bed as an alternative to PSG for monitoring heart rate and breathing during sleep. Subsequent sets of experiments involved the addition of small rubber domes - transparent and black - that were glued to the pressure sensor. The resulting data were compared with the PSG system to determine the accuracy of the pressure sensor readings. The study found that the pressure sensor provided reliable data for extracting heart rate and respiration rate, with mean absolute errors (MAE) of 2.32 and 3.24 for respiration and heart rate, respectively. However, the addition of small rubber hemispheres did not significantly improve the accuracy of the readings, with MAEs of 2.3 breaths per minute and 7.56 bpm for respiration rate and heart rate, respectively. The findings of this study suggest that pressure sensors placed under the bed may serve as a viable alternative to traditional PSG systems for monitoring heart rate and breathing during sleep. These sensors provide a more comfortable and non-invasive method of sleep monitoring. However, the addition of small rubber domes did not significantly enhance the accuracy of the readings, indicating that it may not be a worthwhile addition to the pressure sensor system.
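The MAE comparison against the PSG reference reduces to a simple per-epoch calculation; the rate values in the test are invented for illustration.

```python
def mean_absolute_error(reference, estimate):
    """Mean absolute error between reference rates (e.g. from PSG) and
    sensor-derived rates, computed over matched measurement epochs."""
    if len(reference) != len(estimate):
        raise ValueError("series must have equal length")
    return sum(abs(r - e) for r, e in zip(reference, estimate)) / len(reference)
```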
The Covid-19 virus has triggered a worldwide pandemic and therefore many employees were required to work from home which caused numerous challenges. With the Covid-19 pandemic now in its third year, there are already several studies available on the subject of home offices. To investigate the impact of remote work on employee satisfaction and trust, this quantitative study aims to review existing results and formulate hypotheses based on a conceptual model created through a qualitative study and extensive literature review. The research question is as follows: Does home office during Covid-19 affect employee satisfaction and trust? To test the hypotheses, a structural equation model was constructed and analyzed. The culture of trust and flexibility are identified as the biggest influencing factors in this study.
Sleep is an essential part of human existence, as we are in this state for approximately a third of our lives. Sleep disorders are common conditions that can affect many aspects of life. Sleep disorders are diagnosed in special laboratories with a polysomnography system, a costly procedure requiring much effort from the patient. Several systems have been proposed to address this situation, including performing the examination and analysis at the patient's home, using sensors to detect physiological signals that are automatically analysed by algorithms. This work aims to evaluate the use of a contactless respiratory recording system based on an accelerometer sensor in sleep apnea detection. For this purpose, an installation mounted under the bed mattress records the oscillations caused by the chest movements during the breathing process. The presented processing algorithm filters the obtained signals and determines the presence of apnea events. The performance of the developed system and apnea event detection algorithm (average values of accuracy, specificity and sensitivity are 94.6%, 95.3%, and 93.7%, respectively) confirms the suitability of the proposed method and system for further ambulatory and in-home use.
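The reported accuracy, specificity, and sensitivity follow from the standard confusion-matrix definitions; the counts in the test below are invented for illustration.

```python
def detection_metrics(tp, tn, fp, fn):
    """Accuracy, sensitivity, and specificity of an apnea-event detector
    from confusion-matrix counts (apnea events are the positive class)."""
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    sensitivity = tp / (tp + fn)   # share of true apnea events detected
    specificity = tn / (tn + fp)   # share of normal breathing correctly rejected
    return accuracy, sensitivity, specificity
```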
Healthy sleep is one of the prerequisites for good physical and mental condition, including general well-being. Unfortunately, there are several sleep disorders that can negatively affect this. One of the most common is sleep apnoea, in which breathing is impaired. Studies have shown that this disorder often remains undiagnosed. To avoid this, it is essential to develop a system that can be widely used in a home environment to detect apnoea and to monitor the changes once therapy has been initiated. The conceptualisation of such a system is the main aim of this research. After a thorough analysis of the available literature and the state of the art in this area, a concept of the system was created, which includes the following main components: data acquisition (consisting of two parts), data storage, an apnoea detection algorithm, user and device management, and data visualisation. The modules are interchangeable, and interfaces have been defined for data transfer, most of which operate using the MQTT protocol. System diagrams and detailed component descriptions, including signal requirements and visualisation mockups, have also been developed. The system's design includes the concepts necessary for implementation and can be realised as a prototype in the next phase.
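A minimal sketch of what one message on the MQTT-based data-transfer interfaces could look like; the topic scheme and all field names are assumptions for illustration, not the system's actual specification.

```python
import json

def make_measurement_message(device_id, signal, sampling_rate_hz, timestamp, values):
    """Serialise one chunk of acquired data as a JSON payload, e.g. to be
    published on an MQTT topic such as 'sleep/<device_id>/<signal>'
    (illustrative schema)."""
    return json.dumps({
        "device_id": device_id,
        "signal": signal,                    # e.g. "respiration"
        "sampling_rate_hz": sampling_rate_hz,
        "timestamp": timestamp,              # epoch seconds of first sample
        "values": values,
    }, sort_keys=True)
```

Keeping the schema self-describing like this lets the storage, detection, and visualisation modules stay interchangeable, since each one only depends on the message format, not on the acquiring device.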
The influence of sleep on human health is enormous. Accordingly, sleep disorders can have a negative impact on it. To avoid this, they should be identified and treated in time. For this purpose, objective (with an appropriate device) or subjective (based on perceived values) measurement methods are used for sleep analysis to understand the problem. The aim of this work is to find out whether the two methods are interchangeable and can provide reliable results. In accordance with this goal, a study was conducted with people aged over 65 (a total of 154 night-time recordings) in which both measurement methods were compared. Sleep questionnaires and electronic devices for sleep assessment placed under the mattress were applied to achieve the study aims. The obtained results indicated that a correlation between both measurement methods could be observed for sleep characteristics such as total sleep time, total time in bed and sleep efficiency. However, there are also significant differences in the absolute values of the two measurement approaches for some subjects/nights, which leads us to conclude that substitution is more likely to be feasible for long-term monitoring, where trends matter more than the absolute values for individual nights.
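The agreement between questionnaire and under-mattress measurements for a sleep characteristic can be quantified with a plain Pearson correlation across nights; a minimal sketch with invented per-night series:

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson correlation between two per-night series, e.g. total sleep
    time from the questionnaire vs. from the under-mattress device."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)
```

Note that r measures how well the two methods track each other across nights, not whether their absolute values agree, which is exactly the distinction the study draws.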
Development of an expert system to overpass citizens technological barriers on smart home and living
(2023)
Adopting new technologies can be overwhelming, even for people with experience in the field. For the general public, keeping up with new implementations, releases, brands, and enhancements can cause them to lose interest. There is a clear need to create point sources and platforms that provide helpful information about novel and smart technologies, assisting users, technicians, and providers with products and technologies. The purpose of these platforms is twofold, as they can gather and share information on interests common to manufacturers and vendors. This paper presents the "Finde-Dein-SmartHome" tool, developed in association with the Smart Home & Living competence center [5] to help users learn about, understand, and purchase available technologies that meet their home automation needs. This tool aims to lower the usability barrier and guide potential customers to clear their doubts about privacy and pricing. Communities can use the information provided by this tool to identify market trends that could eventually lower costs for providers and incentivize access to innovative home technologies and devices supporting long-term care.
Managerial accountants spend a large part of their working time on more operational activities in cost accounting, reporting, and operational planning and budgeting. In all these areas, there has been increasing discussion in recent years, both in theory and practice, about using more digital technologies. For reporting, this means not only an intensified discussion of technologies such as RPA and AI but also more intensive changes to existing reporting systems. In particular, management information systems (MIS), which are maintained by managerial accountants and used by managers for corporate management, should be mentioned here. Based on an empirical survey in a large German company, this article discusses the requirements and assessments of users when switching from a regular MIS to a cloud-based system.
The development of automatic solutions for the detection of physiological events of interest is booming. Improvements in the collection and storage of large amounts of healthcare data allow access to these data faster and more efficiently. As a result, the development of artificial intelligence models for the detection and monitoring of a large number of pathologies is becoming increasingly common in the medical field. In particular, developing deep learning models for detecting obstructive sleep apnea (OSA) events is at the forefront. Numerous scientific studies focus on the architecture of the models and the results that these models can provide in terms of OSA classification and Apnea-Hypopnea-Index (AHI) calculation. However, little attention is paid to other aspects of great relevance that are crucial for the training and performance of the models. These aspects include the set of physiological signals used and the preprocessing tasks performed prior to model training. This paper covers the essential requirements that must be considered before training a deep learning model for obstructive sleep apnea detection, and reviews the solutions that currently exist in the scientific literature by analyzing the preprocessing tasks performed prior to training.
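As a hedged illustration of the kind of preprocessing discussed above (not the paper's specific pipeline), a physiological signal is commonly normalized and split into fixed-length epochs before model training; the sampling rate, epoch length, and signal below are assumptions for demonstration only:

```python
import numpy as np

def make_epochs(signal, fs, epoch_seconds=30):
    """Z-score-normalize a 1-D physiological signal and split it
    into fixed-length epochs (dropping the trailing remainder)."""
    x = (signal - signal.mean()) / (signal.std() + 1e-8)
    samples_per_epoch = int(fs * epoch_seconds)
    n_epochs = len(x) // samples_per_epoch
    return x[: n_epochs * samples_per_epoch].reshape(n_epochs, samples_per_epoch)

# Example: 10 minutes of a 4 Hz airflow-like signal -> 20 epochs of 30 s
sig = np.sin(np.linspace(0, 200, 10 * 60 * 4))
epochs = make_epochs(sig, fs=4)
print(epochs.shape)  # (20, 120)
```

Real pipelines would add artifact rejection and per-channel filtering, but the epoching step above is the common denominator across the surveyed approaches.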
During the first years of the last decade, Egypt used to face recurrent electricity cut-offs in summer. In the past few years, the electricity tariff dramatically increased. Radiative cooling to the clear night sky is a renewable energy source that offers a potential solution. The dry desert climate promotes nocturnal radiative cooling applications. This study investigates the potential of nocturnal radiative cooling systems (RCSs) to reduce the energy consumption of the residential building sector in Egypt. The system technology proposed in this work is based on uncovered solar thermal collectors integrated into the building hydronic system. By implementing different control strategies, the same system could be used for both cooling and heating applications. The goal of this paper is to analyze the performance of RCSs in residential buildings in Egypt. The dynamic simulation program TRNSYS was used to simulate the thermal behavior of the system. The issues that make Egypt a relevant case study are first reviewed. The paper then introduces the work done to develop a building model that represents a typical residential apartment in Egypt. Typical occupancy profiles were developed to define the internal thermal gains. The control strategy adopted to optimize the system operation is presented as well. To fully understand and evaluate the operation of the proposed RCS, four simulation cases were considered: 1. a reference case (fully passive), 2. the stand-alone operation of the RCS, 3. ideal heating & cooling operation (fully active), and 4. hybrid operation (the active cooling system supported by the proposed RCS). The analysis considered the three main climates of Egypt, represented by the cities of Alexandria, Cairo and Asyut. The hotter and drier weather conditions resulted in a higher cooling potential and larger temperature differences. The simulated cooling power in Asyut was 28.4 W/m² for a 70 m² absorber field.
For a smaller field area of 10 m², the cooling power reached 109 W/m², but with modest temperature differences. To meet rigorous thermal comfort conditions, the proposed sensible RCS cannot fully replace conventional air-conditioning units, especially in humid areas like Alexandria. When working in a hybrid system, a 10% reduction in the active cooling energy demand could be achieved in Asyut while keeping the cooling set-point at 24 °C. This percentage reduction nearly doubled when the thermal comfort set-point was increased by two degrees (to 26 °C). In a sensitivity analysis, external shading devices as a passive measure as well as the implementation of the Egyptian code for buildings (ECP306/1–2005) were also investigated. The analysis of this study raised other relevant aspects for discussion, e.g. system sizing, environmental effects, limitations and recommendations.
Development of an IoT-based inventory management solution and training module using smart bins
(2023)
Flexibility, transparency and changeability of warehouse environments are playing an increasingly important role in achieving cost-efficient production of small batch sizes. This results in increasing requirements for warehouses in terms of flexibility, scalability, reconfigurability and transparency of material and information flows in order to deal with a large number of different components and with variable material and information flows due to small batch sizes. Therefore, an IoT-based inventory management solution and training module has been developed, implemented and validated at Werk150 – the Factory on campus of the ESB Business School. Key elements of the developed solution are smart bins using weight mats to track the bins' content, plus additional sensors and buttons connected to an IoT hub to collect data on material consumption and manual handling operations. The use of weight mats for the smart bins makes it possible to measure the container content independently of the specific component geometry, and thus for a variety of components, based on the specific component weights. The developed solution enables a focus on the key success elements of the system to synchronize the flow of materials and information, resulting in an increase in flexibility and significantly higher transparency of the material flow. AI-based algorithms are applied to analyse the gathered data and to initiate process optimizations by providing logistics decision makers with a profound and transparent basis for decision making. In order to provide students and industry visitors of the learning factory with the necessary competences and to support the transfer into practice, a training module on IoT-based inventory management was developed and implemented.
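The weight-based content tracking described above can be sketched in a few lines; the tare and per-part weights below are hypothetical, and a real smart bin would additionally filter noisy weight-mat readings:

```python
def estimate_count(measured_weight_g, bin_tare_g, unit_weight_g):
    """Estimate the number of parts in a smart bin from the weight-mat
    reading, the empty-bin tare, and the known per-part weight."""
    net = measured_weight_g - bin_tare_g
    return max(0, round(net / unit_weight_g))

# Hypothetical reading: 1712 g gross, 250 g tare, 12.4 g per screw
print(estimate_count(1712.0, 250.0, 12.4))  # → 118
```

Because only the per-part weight is needed, the same bin and mat can serve any component geometry, which is the property the solution exploits.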
Circular economy aims to support reuse and to extend product life cycles through repair, remanufacturing, upgrades and retrofits, as well as to close material cycles through recycling. To successfully manage the necessary transformation processes towards a circular economy, manufacturing enterprises rely on the competency of their employees. The definition of competency requirements for circular economy-oriented production networks will contribute to the operationalization of circular economy. The International Association of Learning Factories (IALF) states in its mission the development of learning systems addressing these challenges for the training of students and the further education of industry employees. To identify the required competencies for circular economy, the major changes to the product life cycle phases were investigated based on the state of the science and compared to the socio-technical infrastructure and thematic fields of the learning factories considered in this paper. To operationalize the circular economy approach in the product design and production phases in learning factories, an approach for a cross-learning-factory network (the so-called "Cross Learning Factory Product Production System", CLFPPS) has been developed. The proposed CLFPPS represents a network along the design dimensions of learning factories. This approach contributes to the promotion of circular economy in learning factories, as it makes use of and combines the focus areas of different learning factories. This enables the CLFPPS to offer a holistic view of the product life cycle in production networks.
The world is becoming increasingly digital. People have become used to learning and interacting with the world around them through technology, a trend accelerated even further by the Covid-19 pandemic. This is especially relevant to the generation currently entering education systems and the workforce. Considering digital aids and methods of learning is therefore important for future learning. The increasing need for online learning makes the case for integrating digital learning aspects such as serious gaming within education and training systems. Learning factories are among the education and training systems that can benefit from integration with digital learning extensions. Digital capabilities such as digital twins and models further enable the exploration of digital serious games as an extension of learning factories. Since learning factories serve a range of different learning, training, and research purposes, such serious games need to be adaptable across stakeholder perspectives to maximize the value gained from the time and cost invested in their design and development. Research into adaptive serious games for multiple stakeholder perspectives must first determine whether such games can be developed so that they reach the objectives set for each of the included stakeholder perspectives. The purpose of this research is to investigate this through the practical development of a digital serious game adaptive to different stakeholder perspectives.
Product engineering and subsequent phases of product lifecycles are predominantly managed in isolation. Companies therefore do not fully exploit potentials through using data from smart factories and product usage. The novel intelligent and integrated Product Lifecycle Management (i²PLM) describes an approach that uses these data for product engineering. This paper describes the i²PLM, shows the cause-and-effect relationships in this context and presents in detail the validation of the approach. The i²PLM is applied and validated on a smart product in an industrial research environment. Here, the subsequent generation of a smart lunchbox is developed based on production and sensor data. The results of the validation give indications for further improvements of the i²PLM. This paper describes how to integrate the i²PLM into a learning factory.
This article examines the risks and societal costs associated with flexible average inflation targeting in the United States and symmetric inflation targeting in the Eurozone. Employing an empirical approach, we analyze monthly cumulative inflation gaps over a monetary policy horizon of 36 months. By investigating the trajectories of the cumulative inflation gaps, we find a heavy-tailed distribution and a 20 percent probability of over- and undershooting the inflation target. We show that the offsetting mechanisms introduced in the revised monetary strategies lack credibility in ensuring price stability during a period of persistent inflation. Consequently, the credibility of central banks may be compromised. The policy implications are the integration of an escape clause and prompt monetary corrections in cases where the inflation goal is not achieved. This study provides insights for policymakers and central banks, emphasizing challenges in maintaining credibility and price stability within the new monetary strategies.
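The cumulative-gap concept above can be illustrated with a toy Monte Carlo sketch (this is not the paper's empirical method): simulate heavy-tailed monthly inflation paths over a 36-month horizon and compute the share of paths whose average gap leaves a tolerance band. Target, band, and shock distribution are all assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical monthly year-on-year inflation paths (in %); the 2 % target
# and 36-month horizon mirror the setup described above.
TARGET, HORIZON, N_PATHS = 2.0, 36, 10_000
paths = TARGET + rng.standard_t(df=3, size=(N_PATHS, HORIZON))  # heavy-tailed shocks

cum_gap = (paths - TARGET).sum(axis=1) / HORIZON   # average monthly gap over horizon
p_overshoot = np.mean(cum_gap > 0.5)               # share of paths above a 0.5 pp band
p_undershoot = np.mean(cum_gap < -0.5)
print(round(p_overshoot, 3), round(p_undershoot, 3))
```

Even with symmetric shocks, fat tails keep the exceedance probabilities non-negligible, which is the qualitative point the article's empirical distribution makes.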
In the course of more intensive energy generation from regenerative sources, an increased number of energy storage systems is required. In addition to the widespread means of storing electric energy, storing energy thermally can contribute significantly. However, limited research exists on the behaviour of thermal energy storages (TES) in practical operation. While the physical processes are well known, it is nevertheless often not possible to adequately evaluate their performance with respect to the quality of thermal stratification inside the tank, which is crucial for the thermodynamic effectiveness of the TES. The behaviour of a TES is experimentally investigated in cyclic charging and discharging operation in interaction with a cogeneration (CHP) unit at a test rig in the lab. From the measurements, the quality of thermal stratification is evaluated under varying conditions using different metrics such as the normalised stratification factor, the modified MIX number, the exergy number and the exergy efficiency, which extends the state of the art for CHP applications. The results show that the positioning of the temperature sensors for turning the CHP unit on and off has a significant influence on both the effective capacity of a TES and the quality of thermal stratification inside the tank. It is also revealed that positioning at least one of these sensors outside the storage tank, i.e. in the return line to the CHP unit, prevents deterioration of thermal stratification, thereby enhancing thermodynamic effectiveness. Furthermore, the effects of thermal load and thermal load profile on effective capacity and thermal stratification are discussed, even though these are much smaller compared to the effect of positioning the temperature sensors.
The properties of polyelectrolyte multilayers are ruled by the process parameters employed during self-assembly. This is the first study in which a design-of-experiments approach was used to validate and control the production of ultrathin polyelectrolyte multilayer coatings by identifying the ranges of critical process parameters (polyelectrolyte concentration, ionic strength and pH) within which coatings with reproducible properties (thickness, refractive index and hydrophilicity) are created. Mathematical models describing the combined impact of key process parameters on coating properties were developed, demonstrating that only ionic strength and pH affect the coating thickness, but not polyelectrolyte concentration. While the electrolyte concentration had a linear effect, the pH contribution was described by a quadratic polynomial. A significant contribution of this study is the development of a new approach to estimate the thickness of polyelectrolyte multilayer nanofilms by quantitative rhodamine B staining, which might be useful in all cases where ellipsometry is not feasible due to the shape complexity or small size of the coated substrate. The novel approach proposed here overcomes the limitations of known methods as it offers a low spatial sampling size and the ability to analyse a wide area without restrictions on the chemical composition and shape of the substrate.
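The reported response-surface structure (linear in ionic strength, quadratic in pH) can be sketched with an ordinary least-squares fit; the data points below are invented for demonstration and do not come from the study:

```python
import numpy as np

# Hypothetical DoE data (illustrative only): ionic strength I (mol/L),
# pH, and measured coating thickness (nm).
I  = np.array([0.1, 0.1, 0.5, 0.5, 0.3, 0.3, 0.3])
pH = np.array([4.0, 8.0, 4.0, 8.0, 6.0, 4.0, 8.0])
thickness = np.array([12.0, 14.0, 20.0, 22.0, 19.0, 16.0, 18.0])

# Response-surface model: linear in ionic strength, quadratic in pH,
# fitted by ordinary least squares.
X = np.column_stack([np.ones_like(I), I, pH, pH**2])
coef, residuals, *_ = np.linalg.lstsq(X, thickness, rcond=None)
print(coef.round(3))
```

In a real DoE workflow the fitted coefficients would then be tested for significance, which is how a factor such as polyelectrolyte concentration can be shown to have no effect on thickness.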
Twitter and citations
(2023)
Social media, especially Twitter, plays an increasingly important role among researchers in showcasing and promoting their research. Does Twitter affect academic citations? Making use of Twitter activity about columns published on VoxEU, a renowned online platform for economists, we develop an instrumental variable strategy to show that Twitter activity about a research paper has a causal effect on the number of citations that this paper will receive. We find that the existence of at least one tweet, as opposed to none, increases citations by 16-25%. Doubling overall Twitter engagement boosts citations by up to 16%.
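As a hedged sketch of the general instrumental-variable logic (not the authors' VoxEU-based instrument or data), two-stage least squares on synthetic data recovers a causal effect despite a confounder that biases naive regression:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5_000

# Synthetic setup (illustrative only): z instruments "Twitter activity" x,
# which affects "citations" y; u is an unobserved confounder.
z = rng.normal(size=n)
u = rng.normal(size=n)
x = 0.8 * z + 0.5 * u + rng.normal(size=n)
y = 0.3 * x + 0.5 * u + rng.normal(size=n)   # true causal effect: 0.3

# Two-stage least squares: regress x on z, then y on the fitted values.
Z = np.column_stack([np.ones(n), z])
x_hat = Z @ np.linalg.lstsq(Z, x, rcond=None)[0]
X_hat = np.column_stack([np.ones(n), x_hat])
beta = np.linalg.lstsq(X_hat, y, rcond=None)[0]
print(round(beta[1], 2))  # close to the true effect 0.3
```

The validity of such an estimate rests entirely on the instrument being correlated with the treatment but not with the outcome's error term, which is the identifying assumption the paper defends for its strategy.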
Towards a sustainable future, looking beyond the system boundaries of a single manufacturing company is necessary to promote meaningful collaborations in terms of circular economy principles. In this context, digital data processing technologies that connect potential collaborators are seen as enablers for making use of proven collaborative circular business models (CCBMs). Since most such data processing technologies rely on features to describe the entities involved, it is essential to provide guidance for identifying and selecting the relevant and most appropriate ones. Defining critical success factors (CSFs) is considered a suitable instrument to describe the decisive factors. A systematic literature review (SLR) followed by a qualitative synthesis investigates two scientific fields of work, namely (1) the generally relevant features of CCBMs and (2) methodologies for determining CSFs. This results in the development of a conceptual framework that provides guidance for digital applications performing further digital processing based on the relevant CSFs of the specific CCBM.
Machine failures’ consequences – a classification model considering ultra-efficiency criteria
(2023)
To strive for sustainable production, maintenance has to evaluate possible machine failure consequences not just economically but also holistically. Approaches such as the ultra-efficiency factory consider energy, material, human/staff, emission, and organization as optimization dimensions. These ultra-efficiency dimensions can be used to analyze not only the respective machine failure but also its effects on the entire production system holistically. This paper presents an easy-to-use method, based on a questionnaire, for assessing the failure consequences of a machine malfunction in a production system considering the ultra-efficiency dimensions. The method was validated in a battery production.
Using predictive maintenance, more efficient processes can be implemented, leading to lower maintenance costs and increased availability. The development of a predictive maintenance solution currently requires a high investment of time and capacity as well as, often, interdisciplinary cooperation. This paper presents a standardized model for describing a predictive maintenance use case. The description model is used to collect, present, and document the information required for the implementation of predictive maintenance use cases by and for different stakeholders. Based on this model, predictive maintenance solutions can be introduced more efficiently. The method is validated across departments in the automotive sector.
The increasing complexity and need for availability of automated guided vehicles (AGVs) pose challenges to companies, leading to a focus on new maintenance strategies. In this paper, a smart maintenance architecture based on a digital twin is presented to optimize the technical and economic effectiveness of AGV maintenance activities. To realize this, a literature review was conducted to identify the necessary requirements for smart maintenance and digital twins. The identified requirements were combined into modules and then integrated into an architecture. The architecture was evaluated on a real AGV, using the battery as one of the critical components.
Condition monitoring supported with artificial intelligence, cloud computing, and industrial internet of things (IIoT) technologies increases the feasibility of predictive maintenance. However, the cost of traditional sensors, data acquisition systems, and the required information technology expert-knowledge challenge the industry. This paper presents a hybrid condition monitoring system (CMS) architecture consisting of a distributed, low-cost IIoT-sensor solution. The CMS uses micro-electro-mechanical system (MEMS) microphones for data acquisition, edge computing for signal preprocessing, and cloud computing, including artificial neural networks (ANN) for higher-level information processing. The system's feasibility is validated using a testbed for reciprocating linear-motion axes.
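A minimal sketch of the edge-side preprocessing stage described above, assuming simple RMS and dominant-frequency features computed per audio frame (the system's actual feature set and ANN architecture are not specified here):

```python
import numpy as np

def frame_features(frame, fs):
    """Edge-style preprocessing of one audio frame: RMS level and
    dominant frequency from the magnitude spectrum."""
    rms = float(np.sqrt(np.mean(frame**2)))
    spectrum = np.abs(np.fft.rfft(frame))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / fs)
    dominant = float(freqs[np.argmax(spectrum[1:]) + 1])  # skip the DC bin
    return rms, dominant

# Example: a 50 Hz tone sampled at 8 kHz for one second
fs = 8_000
t = np.arange(fs) / fs
rms, dom = frame_features(np.sin(2 * np.pi * 50 * t), fs)
print(round(rms, 3), dom)  # ~0.707, 50.0
```

Compressing each frame to a few such features on the edge device keeps the bandwidth to the cloud model low, which is the motivation for the hybrid split in the architecture.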
Herein, biochar from biomass residues is demonstrated as active materials for the catalytic cracking of waste motor oil into diesel-like fuels. Above all, alkali-treated rice husk biochar showed great activity with a 250% increase in the kinetic constant compared to the thermal cracking. It also showed better activity than synthetic materials, as previously reported. Moreover, much lower activation energy (185.77 to 293.48 kJ/mol) for the cracking process was also obtained. According to materials characterization, the catalytic activity was more related to the nature of the biochar’s surface than its specific surface area. Finally, liquid products complied with all the physical properties defined by international standards for diesel-like fuels, with the presence of hydrocarbons chains between C10 - C27 similar to the ones obtained in commercial diesel.
Digital twins deployed in production are important in practice and interesting for research. Currently, mostly structured data, coming e.g. from sensors and timestamps of related stations, are integrated into Digital Twins. However, semi- and unstructured data are also important to display the current status of a digital twin (e.g., of a machine or produced good). Process Mining and Text Mining in combination can be used to exploit log file data to understand the current state of the process as well as to highlight issues. As a result, issue-related reactions can be taken more quickly and in a more targeted and cost-oriented way. Applying a design science research approach, a prototype is developed here as an artefact based on derived requirements. This prototype helps to understand and clarify the possibilities of Process Mining and Text Mining based on log data for production-related Digital Twins. Contributions for practice and research are described. Furthermore, limitations of the research and future opportunities are pointed out.
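As an illustrative sketch (the log format and field names below are invented), issue-related events can be extracted from semi-structured log lines and aggregated per station, the kind of step text mining contributes before a process-mining analysis:

```python
import re
from collections import Counter

# Hypothetical machine log lines in an invented key=value format
log = [
    "2023-05-02 08:01:12 station=press  event=start  part=A17",
    "2023-05-02 08:01:58 station=press  event=error  msg=jam detected",
    "2023-05-02 08:02:30 station=press  event=start  part=A18",
    "2023-05-02 08:03:02 station=oven   event=error  msg=overtemp",
]

# Extract (station, event) pairs, then count error events per station
pattern = re.compile(r"station=(\w+)\s+event=(\w+)")
events = [pattern.search(line).groups() for line in log]
errors_per_station = Counter(s for s, e in events if e == "error")
print(errors_per_station)
```

Once the unstructured lines are reduced to event tuples like these, standard process-mining techniques (e.g. directly-follows analysis) can be applied on top.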
This paper presents the first part of a research work conducted at the University of Applied Sciences (HFT Stuttgart). The aim of the research was to investigate the potential of low-cost renewable energy systems to reduce the energy demand of the building sector in hot and dry areas. Radiative cooling to the night sky represents a low-cost renewable energy source. Dry desert climate conditions promote radiative cooling applications. The system technology adopted in this work is based on uncovered solar thermal collectors integrated into the building's hydronic system. By implementing different control strategies, the same system could be used for cooling as well as for heating applications. This paper focuses on identifying the collector parameters that are required as coefficients to configure such an unglazed collector, i.e. to calibrate its mathematical model within the simulation environment. The parameter identification process requires testing the collector for its thermal performance. This paper attempts to provide insight into the dynamic testing of uncovered solar thermal collectors (absorbers), taking into account their prospective operation at nighttime for radiative cooling applications. In this study, the main parameters characterizing the performance of the absorbers for radiative cooling applications are identified and obtained from a standardized testing protocol. To this end, a number of plastic solar absorbers of different designs were tested on the outdoor test-stand facility at HFT Stuttgart to characterize their thermal performance. The testing process was based on the quasi-dynamic test method of the international standard for solar thermal collectors, EN ISO 9806. The test database was then used within a mathematical optimization tool (GenOpt) to determine the optimal parameter settings of each absorber under test. These performance parameters were essential for comparing the thermal performance of the tested absorbers. The coefficients (identified parameters) were then used to plot the thermal efficiency curves of all absorbers, for both the heating and cooling modes of operation. Based on the intended main scope of system utilization (heating or cooling), the tested absorbers could be benchmarked. Hence, one of these absorbers was selected to be used in the subsequent simulation phase, as planned in the research project.
In countries such as Germany, where municipalities have planning sovereignty, problems of urban sprawl often arise. As the dynamics of land development have not substantially subsided over the last years, the national government decided to test the instrument of 'Tradable Planning Permits' (TPP) in a nationwide field experiment with 87 municipalities involved. The field experiment implemented the key features of a TPP system in a laboratory setting with approximated real socioeconomic and planning conditions. In a TPP system, allocated planning permits must be used by municipalities for developing land. The permits can be traded between local jurisdictions, giving them flexibility in deciding how to comply with the regulation. In order to evaluate the performance of such a system, specific field data about future building areas and their impact on community budgets for the period 2014–2028 were collected. The field experiment comprised several sessions with representatives of the municipalities and with students. The participants were confronted with two (municipalities) or four (students) schemes. The results show that a trading system can curb land development in an effective and also efficient manner. However, depending on the regulatory framework, the trading schemes show different price developments and distributional effects. Even inexperienced representatives of the local authorities can easily handle the permits in administration and in the established market. A trading scheme sets very strong incentives to save open space and to direct development activities to areas within existing planning boundaries. It is therefore a promising instrument for Germany as well as for other regions or countries with an established land-use planning system.
We analyze economics PhDs' collaborations in peer-reviewed journals from 1990 to 2014 and investigate such collaborations' quality in relation to each co-author's research quality, field and specialization. We find that a greater overlap between co-authors' previous research fields is significantly related to a greater publication success of co-authors' joint work, and this is robust to alternative specifications. Co-authors who engage in a distant collaboration are significantly more likely to have a large research overlap, but this significance is lost when co-authors' social networks are accounted for. High-quality collaboration is more likely to emerge as a result of an interaction between specialists and generalists with overlapping fields of expertise. Regarding interactions across subfields of economics (interdisciplinarity), these are more likely to be conducted by co-authors who already have interdisciplinary portfolios than by co-authors who are specialized or starred in different subfields.
COVID-19 and educational inequality: How school closures affect low- and high-achieving students
(2021)
In spring 2020, governments around the globe shut down schools to mitigate the spread of the novel coronavirus. We argue that low-achieving students may be particularly affected by the lack of educator support during school closures. We collect detailed time-use information on students before and during the school closures in a survey of 1099 parents in Germany. We find that while students on average reduced their daily learning time of 7.4 h by about half, the reduction was significantly larger for low-achievers (4.1 h) than for high-achievers (3.7 h). Low-achievers disproportionately replaced learning time with detrimental activities such as TV or computer games rather than with activities more conducive to child development. The learning gap was not compensated by parents or schools who provided less support for low-achieving students.
This paper presents a description model for smart, connected devices used in a manufacturing context. Similar to the widespread adoption of smart products for personal and private usage, recent developments have led to a plethora of devices offering a variety of features and capabilities. Manufacturing companies undergoing digital transformation demand guidance with respect to the systematic introduction of smart, connected devices. The introduction of smart, connected devices constitutes a strategic decision due to the high future committed costs after introduction and the maintenance of a smart device fleet by a vendor. This paper aims to support the introduction efforts by classifying the devices and thus helping companies identify their specific requirements for smart, connected devices before initiating widespread procurement. Mapping the features of these devices along various attributes allows the clustering of smart, connected devices, including a requirement list for their implementation on the shopfloor. Four commercially available smart, connected devices were analyzed using the description model.
Parallel grippers offer multiple applications thanks to their flexibility. Their application field ranges from aerospace and automotive to medicine and communication technologies. However, grippers are subject to wear and errors during the execution of their operation, which affects their performance. In this context, the remaining useful life (RUL) defines the remaining lifespan of an asset, at a particular time of operation, until failure occurs. The exact lifespan of an asset is uncertain, so the RUL model and estimation must be derived from available sources of information. This paper presents a method for estimating the RUL of a two-jaw parallel gripper. After an introduction to the topic, an overview of the existing literature and RUL methods is presented. Subsequently, the method for estimating the RUL of grippers is explained. Finally, the results are summarized and discussed before the outlook and further challenges are presented.
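One simple, hedged way to obtain an RUL estimate (a generic degradation-trend extrapolation, not the method developed in the paper) is to fit a linear wear trend and extrapolate to a failure threshold; all measurements below are hypothetical:

```python
import numpy as np

def estimate_rul(cycles, wear, failure_threshold):
    """Fit a linear degradation trend to wear measurements and
    extrapolate the remaining cycles until the failure threshold."""
    slope, intercept = np.polyfit(cycles, wear, deg=1)
    if slope <= 0:
        return float("inf")  # no measurable degradation trend
    cycle_at_failure = (failure_threshold - intercept) / slope
    return max(0.0, cycle_at_failure - cycles[-1])

# Hypothetical jaw-wear measurements (mm) over operating cycles
cycles = np.array([0, 1_000, 2_000, 3_000, 4_000])
wear = np.array([0.00, 0.05, 0.11, 0.14, 0.21])
print(estimate_rul(cycles, wear, failure_threshold=0.50))
```

More realistic RUL models replace the linear trend with stochastic degradation processes and report a distribution rather than a point estimate, since the exact lifespan is uncertain.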
The general conclusion of climate change studies is the necessity of eliminating net CO2 emissions in general and from the electric power systems in particular by 2050. The share of renewable energy is increasing worldwide, but due to the intermittent nature of wind and solar power, a lack of system flexibility is already hampering the further integration of renewable energy in some countries. In this study, we analyze if and how combinations of carbon pricing and power-to-gas (PtG) generation in the form of green power-to-hydrogen followed by methanation (which we refer to as PtG throughout) using captured CO2 emissions can provide transitions to deep decarbonization of energy systems. To this end, we focus on the economics of deep decarbonization of the European electricity system with the help of an energy system model. In different scenario analyses, we find that a CO2 price of 160 €/t (by 2050) is on its own not sufficient to decarbonize the electricity sector, but that a CO2 price path of 125 (by 2040) up to 160 €/t (by 2050), combined with PtG technologies, can lead to an economically feasible decarbonization of the European electricity system by 2050. These results are robust to higher than anticipated PtG costs.
With the digital transformation, companies will experience a change that focuses on shaping the organization into an agile organizational form. In today's competitive and fast-moving business environment, it is necessary to react quickly to changing market conditions. Agility represents a promising option for overcoming these challenges. The path to an agile organization is a development process that requires consideration of countless levels of the enterprise. This paper examines the impact of digital transformation on agile working practices and the benefits that can be achieved through technology. To cope with today's so-called VUCA (Volatility, Uncertainty, Complexity and Ambiguity) world, agile ways of working can be applied, but project management requires adaptation. In the qualitative study, expert interviews were conducted and analyzed using the grounded theory method. As a result, a model is presented that shows the influencing factors and potentials of agile management in the context of the digital transformation of medium-sized companies.
Industrial practice is characterized by random events, also referred to as internal and external turbulences, which disturb the target-oriented planning and execution of production and logistics processes. Methods of probabilistic forecasting, in contrast to single-value predictions, allow an estimation of the probability of various future outcomes of a random variable in the form of a probability density function instead of predicting one specific outcome. Probabilistic forecasting methods, which are embedded into the analytics process to gain insights about the future based on historical data, therefore offer great potential for incorporating uncertainty into planning and control in industrial environments. In order to familiarize students with these potentials, a training module on the application of probabilistic forecasting methods in production and intralogistics was developed in the learning factory 'Werk150' of the ESB Business School (Reutlingen University). The theoretical introduction to the topic of analytics, probabilistic forecasting methods, and the transition to the application domain of intralogistics is based on examples from other disciplines, such as weather forecasting and energy consumption forecasting. In addition, data sets of the learning factory are used to familiarize the students with the steps of the analytics process in a practice-oriented manner. After this, the students are given the task of identifying the influencing factors and the information required to capture intralogistics turbulences based on defined turbulence scenarios (e.g. failure of a logistical resource) in the learning factory. Within practical production scenario runs, the students apply probabilistic forecasting, using and comparing different probabilistic forecasting methods.
The graduate training module allows the students to experience the potential of using probabilistic forecasting methods to improve production and intralogistics processes in the context of turbulences and to build up corresponding professional and methodological competencies.
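The contrast between a single-value prediction and a probabilistic forecast described above can be sketched in a few lines of code. The transport-time data, function names, and quantile levels below are purely illustrative assumptions, not material from the Werk150 module:

```python
import numpy as np

def point_forecast(history):
    """Single-value prediction: here simply the historical mean."""
    return float(np.mean(history))

def probabilistic_forecast(history, quantiles=(0.1, 0.5, 0.9)):
    """Empirical-quantile forecast: a distribution over outcomes
    rather than one value, estimated from the historical sample."""
    return {q: float(np.quantile(history, q)) for q in quantiles}

# Hypothetical transport times (minutes) in an intralogistics process,
# including occasional turbulences (e.g. failure of a logistical resource).
history = [12, 11, 13, 12, 35, 12, 11, 14, 12, 40]

pf = point_forecast(history)          # 17.2 — pulled up by the outliers
qf = probabilistic_forecast(history)  # {0.1: 11.0, 0.5: 12.0, 0.9: 35.5}
```

The point forecast hides the turbulences in one averaged number, while the quantile forecast exposes both the typical outcome (median 12 minutes) and the heavy upper tail caused by the disturbed runs.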
Especially if the potential of technical and organizational measures for ergonomic workplace design is limited, exoskeletons can be considered as innovative ergonomic aids to reduce the physical workload of workers. Recent scientific findings from ergonomic analyses with and without exoskeletons indicate that strain reduction can be achieved, particularly at workplaces with lifting, holding, and carrying processes. Currently, a work system design method is under development that incorporates criteria and characteristics for the design of work systems in which a human worker is supported by an exoskeleton. Based on the properties of common passive and active exoskeletons, factors influencing the human on which an exoskeleton can have a positive or negative effect (e.g. additional weight) were derived. The method will be validated by the conceptualization and setup of several work system demonstrators at Werk150, the factory of ESB Business School on the campus of Reutlingen University, to prove the positive ergonomic effect on humans and to support the process of choosing a suitable exoskeleton. The developed method and demonstrators enable users to experience the positive ergonomic effects of exoskeletal support in lifting, holding, and carrying processes in logistics and production. The new work system design method will help employees pursue their professional activity longer without substantial injuries and be deployed more flexibly at different work stations. It also opens up new work concepts, strategies, and scenarios to reduce the risk of occupational accidents and to promote the compatibility of work for employees. A training module is being developed and evaluated with participants from industry and master students to build up competence.
The early involvement of insights gained through intelligence and data analysis is becoming increasingly important for developing new products, leading to a completely different conception of product creation, development, and engineering processes that uses the advantages a digital twin entails. A novel stage-gate process is introduced that is holistically anchored in learning factories, adopting idea generation and idea screening at an early stage, beta testing of first prototypes, technical implementation in real production scenarios, business analysis, market evaluation, pricing, service models, as well as innovative social media portals. Corresponding product modelling in the sense of sustainability, circular economy, and data analytics forecasts the product on the market both before and after market launch, with data interpretation interlinked in near real-time. The digital twin represents the link between the digital model and the digital shadow. Additionally, the connection of the digital twin with the product provides constantly updated operating status and process data as well as a mapping of technical properties and real-world behaviour. A future networked product, with embedded information technology and the ability to initiate and carry out its own further development, is able to interact with people and environments and is thus relevant to the way of life of future generations. In today's development work for this new product creation approach, "Werk150" is on the one hand the object of the development itself and on the other hand the validation environment. In the next step, new learning modules and scenarios for trainings at master level will be derived from these findings.
The use of deep learning models with medical data is becoming more widespread. However, although numerous models have shown high accuracy in medical-related tasks, such as medical image recognition (e.g. radiographs), there are still many problems with seeing these models operating in a real healthcare environment. This article presents a series of basic requirements that must be taken into account when developing deep learning models for biomedical time series classification tasks, with the aim of facilitating the models' subsequent move into production in healthcare. These requirements range from the correct collection of data to existing techniques for correctly explaining the results obtained by the models. The latter matters because one of the main reasons deep learning models are not more widespread in healthcare settings is their lack of clarity when it comes to explaining decision making.
Nowadays, the importance of early active patient mobilization in the recovery and rehabilitation phase has increased significantly. One way to involve patients in the treatment is a gamification-like approach, which is one of the methods of motivation in various life processes. This article presents a system prototype for patients who require physical activity as part of active early mobilization after medical interventions or during illness. Bedridden patients and people with a sedentary lifestyle (predominantly lying in bed) are also potential users. The main idea of the concept was a non-contact implementation that makes the system feel effortless for patients during use. The system consists of three related parts: hardware, software, and a game application. To test the relevance and coherence of the system, it was used by 35 people. The participants were asked to play a video game requiring them to make body movements while lying down. They were then asked to take part in a small survey to evaluate the system's usability. As a result, we offer a prototype consisting of hardware and software parts that can increase and diversify physical activity during active early mobilization of patients and prevent the occurrence of possible health problems due to predominantly low activity. The proposed design could be implemented in hospitals, rehabilitation centers, and even at home.
Healthy sleep is required for sufficient restoration of the human body and brain. Therefore, in the case of sleep disorders, appropriate therapy should be applied in a timely manner, which requires a prompt diagnosis. Traditionally, a sleep diary is part of diagnosis and therapy monitoring for some sleep disorders, such as cognitive behaviour therapy for insomnia. To automate sleep monitoring and make it more comfortable for users, substituting a sleep diary with a smartwatch measurement could be considered. With the aim of providing accurate results, a study with a total of 30 night recordings was conducted. Objective sleep measurement with a Samsung Galaxy Watch 4 was compared with a subjective approach (sleep diary), evaluating four relevant sleep characteristics: time of falling asleep, wake-up time, sleep efficiency (SE), and total sleep time (TST). The analysis demonstrated that the median difference between the two measurement approaches was 7 and 3 minutes for the time of falling asleep and the wake-up time, respectively, which allows substituting the subjective measurement with a smartwatch. The SE was determined with a median difference between the two measurement methods of 5.22%. This result also implies that substitution is possible. Some single recordings indicated a higher variance between the two approaches. Therefore, it can be concluded that substitution provides reliable results primarily in the case of long-term monitoring. The results of the evaluation of the TST measurement do not allow us to recommend substituting the measurement method.
Home health applications have evolved over the last few decades. Assistive systems such as a data platform connected with health devices allow health-related data to be transmitted automatically to a database. However, there remain significant challenges concerning intermodular communication. Central among them is the challenge of achieving interoperability, the ability of devices to communicate and share data with each other. A major goal of this project was to extend an existing data platform (COMES®) and establish working interoperability by connecting assistive devices with differing approaches. We describe this process for a sleep monitoring device and a physical exercise device. Furthermore, we aimed to test this setup and the implementation with a data platform in both a laboratory and an in-home setting with 11 elderly participants. The platform modification was realized, and the relevant changes were made so that the incoming data could be processed by the data platform as well as visually displayed in real time. Data was recorded by the respective device and transmitted to the data server with minor disruptions. Our observations affirmed that difficulties and data loss are far more likely to occur with increasing technical complexity, in the event of an unstable internet connection, or when the device setup requires (elderly) subjects to take specific steps for proper functioning. We emphasize the importance of testing and evaluating home health technologies in real-life circumstances.
The citizen-centered health platform project is intended to provide a platform that can be used in EU cross-border regions, where social and economic exchange occurs across national borders. The overriding challenges are: (a) social: improving citizen-centered health and care provision; (b) technical: providing a digital platform for networking citizens, service providers, and municipal actors; (c) economic: developing long-term successful (sustainable) business models and value chains. The platform should strengthen and expand existing networks and establish new regional networks. Each network addresses particular challenges and applies solutions in a region-specific manner. Here, the national boundary conditions and the interregional needs play an essential role. These objectives require sufficient participation of civil society representatives. Furthermore, the platform will establish an overarching, sustainable, and knowledge-based network of health experts. The platform is to be jointly developed and implemented in the regions and follows an open-access approach. Therefore, synergies will be shared more quickly, strengthening competencies and competitiveness. In addition to practice partners, scientific and municipal institutions and SMEs are involved. The actors thus contribute to scientific performance, innovative strength, and resilience.
Modern production systems are characterized by the increasing use of CPS and IoT networks. However, processing the available information for adaptation and reconfiguration often occurs in relatively large time cycles and thus does not exploit the optimization potential available in the short term. In this paper, a concept is presented that, considering the process information of the individual heterogeneous system elements, detects optimization potential and performs or proposes adaptation or reconfiguration. The concept is evaluated by means of a case study in a learning factory. The resulting system thus enables better exploitation of the potential of the CPPS.
Energy Communities explores core potential systemic benefits and costs in engaging consumers into communities, particularly relating to energy transition. The book evaluates the conditions under which energy communities might be regarded as customer-centered, market-driven and welfare-enhancing. The book also reviews the issue of prevalence and sustainability of energy communities and whether these features are likely to change as opportunities for distributed energy grow. Sections cover the identification of welfare considerations for citizens and for society on a local and national level, and from social, economic and ecological perspectives, while also considering different community designs and evolving business models.
The global demand for resources such as energy, land, or water is constantly increasing. It is therefore not surprising that research on the Food-Energy-Water (FEW) nexus has become a scientific as well as a general focus in recent years. A significant increase in publications since 2015 can be observed, and it can be expected that this trend will continue. A multilevel (macro, meso, and micro) perspective is essential, as the FEW nexus has cross-sectoral interdependencies. Several review studies on the FEW nexus can be found in the literature; in general, it can be concluded that the FEW nexus is a multi-disciplinary and complex topic. The studies examined identify essential fields of action for research, policy, and society. However, questions such as the following are still not fully discussed in the literature: What are the main research fields at each level? Can the research be divided into specific clusters? Do the clusters correlate with the levels, and which modeling methods are used in the clusters and levels? An extensive literature review was conducted to gain insight into the existing research areas. Especially in fields such as the FEW nexus, the body of literature can become huge, and a human could get lost analyzing it manually. Therefore, we created word clouds and performed a cluster and network analysis to support the selection of the most relevant papers for a detailed reading. Most publications appeared in 2021, with 173 publications, which corresponds to a share of 26.6%. There has been a significant increase since 2015, and it can be expected that this trend will continue in the coming years. Most of the first authors come from the USA (25.4%), followed by China with 22.4%. From the word cloud and the top 20 words appearing in the title and abstract, it can be deduced that the topic of water is the most represented.
However, the terms system, resource, model, study, change, development, and management also appear to be very important, which indicates the importance of a holistic approach to the topic. In total, 9 clusters could be identified at the different levels. Three clusters form well; for the others, a rather diffuse picture can be observed. In order to find out which topics are hidden behind the individual clusters, 6 publications from each cluster were subjected to a more detailed examination. With these steps, 54 publications were identified for detailed consideration. The modeling approaches currently applied in research can be classified into domain-specific tools (e.g. global water models, crop models, or global climate models) and more general tools to perform, for example, a life cycle analysis, spatial analysis using a geographic information system, or system dynamics for a general understanding of the links between the domains. With the domain-specific tools, detailed research questions can be addressed for a specific domain. However, these tools have the disadvantage that the links between the sectors food, energy, and water in particular are not fully considered. Many implementations made today are at the lowest level (micro), relate to bounded spatial areas, and are derived from macro- and meso-level goals.
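The word-cloud step of such a review boils down to counting term frequencies across titles and abstracts. A minimal sketch, with a hypothetical three-abstract corpus and a toy stop-word list standing in for the real bibliography:

```python
from collections import Counter
import re

# Hypothetical mini-corpus: stand-ins for the reviewed titles/abstracts.
abstracts = [
    "Water demand modeling for the food energy water nexus",
    "Energy system model linking water use and food production",
    "Spatial analysis of water resources and land management",
]

STOPWORDS = {"the", "for", "and", "of"}

def term_frequencies(texts):
    """Count word occurrences across all texts, as done
    before rendering a word cloud."""
    tokens = []
    for text in texts:
        tokens += [t for t in re.findall(r"[a-z]+", text.lower())
                   if t not in STOPWORDS]
    return Counter(tokens)

freq = term_frequencies(abstracts)
# freq.most_common(1) → [('water', 4)]
```

In this toy corpus, "water" dominates the counts, mirroring the finding above that water is the most represented topic; a real pipeline would add lemmatization and a proper stop-word list before clustering.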
The transmembrane Ca2+-activated Cl− channel human bestrophin-1 (hBest1) is expressed in the retinal pigment epithelium, and mutations of the BEST1 gene cause ocular degenerative diseases collectively referred to as “bestrophinopathies”. A large number of genetic, biochemical, biophysical, and molecular biological studies have been performed to understand the relationship between the structure and function of the hBest1 protein and its pathophysiological significance. Here, we review the current understanding of hBest1 surface organization, its interactions with membrane lipids in model membranes, and its association with microdomains of cellular membranes. These highlights are significant for the modulation of channel activity in cells.
Blockchain is a technology for the secure processing and verification of data transactions based on a distributed peer-to-peer network that uses cryptographic processes, consensus algorithms, and backward-linked blocks to make transactions virtually immutable. Within supply chain management, blockchain technology offers potential for increasing supply chain transparency, visibility, automation, and efficiency. However, its complexity requires future employees to have comprehensive knowledge of the functionality of blockchain-based applications in order to be able to apply their benefits to scenarios in supply chains and production. Learning factories represent a suitable environment allowing learners to experience new technologies and to apply them to virtual and physical processes throughout value chains. This paper presents a concept to practically transfer knowledge about the technical functionality of blockchain technology to future engineers and software developers working within supply chains and production operations, sensitizing them to the advantages of decentralized applications. First, the concept proposes methods to playfully convey immutable backward-linked blocks and the embedment of blockchain smart contracts. Subsequently, the students use this knowledge to develop blockchain-based application scenarios by means of an exemplary product in a learning factory environment. Finally, the developed solutions are implemented with the help of a prototypical decentralized application, which enables a holistic mapping of supply chain events.
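The backward-linked blocks mentioned above can be conveyed with a minimal hash chain. The following sketch (block fields and function names are our own illustrative choices, not the paper's teaching material) shows why tampering with any recorded event invalidates the chain:

```python
import hashlib
import json

def _digest(block):
    """SHA-256 over the block's payload fields in a canonical order."""
    payload = {k: block[k] for k in ("index", "data", "prev_hash")}
    return hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()

def make_block(index, data, prev_hash):
    """A block stores its payload plus the hash of its predecessor,
    which is what makes the chain backward-linked."""
    block = {"index": index, "data": data, "prev_hash": prev_hash}
    block["hash"] = _digest(block)
    return block

def chain_is_valid(chain):
    """Recompute every hash; a tampered block breaks validation."""
    for i, block in enumerate(chain):
        if block["hash"] != _digest(block):
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

# Hypothetical supply chain events recorded as a mini-chain.
genesis = make_block(0, "order placed", "0" * 64)
chain = [genesis, make_block(1, "goods shipped", genesis["hash"])]
```

Changing the data of any block alters its recomputed hash, so validation fails for that block and for every successor whose prev_hash no longer matches.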
The fifth mobile communications generation (5G) offers the deployment scenario of licensed 5G standalone non-public networks (NPNs). Standalone NPNs are locally restricted 5G networks based on 5G New Radio technology which are fully isolated from public networks. NPNs operate on their own dedicated core network and offer organizations high data security and customizability for intrinsic network control. Especially in networked and cloud manufacturing, 5G is seen as a promising enabler for delay-sensitive applications such as autonomous mobile robots and robot motion control based on the tactile internet, which requires wireless communication with deterministic traffic and strict cycle times. However, currently available industrial standalone NPNs do not meet the performance parameters defined in the 5G specification and standardization process. Current research lacks performance measurements of download, upload, and time delays of 5G standalone-capable end-devices in NPNs with currently available software and hardware in industrial settings. Therefore, this paper presents initial measurements of the data rate and the round-trip delay in standalone NPNs with various end-devices to generate a first performance benchmark for 5G-based applications. In addition, five end-devices are compared to gain insights into the performance of currently available standalone-capable 5G chipsets. To validate the data rate, three locally hosted measurement methods, namely iPerf3, LibreSpeed, and OpenSpeedTest, are used. Locally hosted Ping and LibreSpeed were executed to validate the time delay. The 5G standalone NPN of Reutlingen University uses licensed frequencies between 3.7 and 3.8 GHz and serves as the testbed for this study.
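Round-trip delay measurements of the kind described here follow a simple echo-and-time pattern. As an illustrative stand-in for the 5G testbed (no real NPN involved), the sketch below times a UDP round trip against a locally hosted echo endpoint on the loopback interface:

```python
import socket
import threading
import time

def echo_once(sock):
    """Echo a single datagram back to its sender."""
    data, addr = sock.recvfrom(1024)
    sock.sendto(data, addr)

def measure_rtt(target, payload=b"ping"):
    """Send a datagram and time the round trip, analogous to a
    Ping-style delay measurement against a locally hosted server."""
    client = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    client.settimeout(5.0)
    start = time.perf_counter()
    client.sendto(payload, target)
    client.recvfrom(1024)
    rtt_ms = (time.perf_counter() - start) * 1000
    client.close()
    return rtt_ms

# Locally hosted echo endpoint on an OS-assigned free port.
server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
server.bind(("127.0.0.1", 0))
threading.Thread(target=echo_once, args=(server,), daemon=True).start()

rtt = measure_rtt(server.getsockname())
```

A real benchmark would repeat this over many samples and report distribution statistics (median, jitter, tail latency), since single-shot delays fluctuate; throughput tools such as iPerf3 use the same locally hosted client/server pattern with sustained traffic instead of single datagrams.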
Towards a model for holistic mapping of supply chains by means of tracking and tracing technologies
(2022)
The usage of tracking and tracing technologies not only enables transparency and visibility of supply chains but also offers far-reaching advantages for companies, such as ensuring product quality or reducing supplier risks. Increasing the amount of shared information supports both internal and external planning processes as well as the stability and resilience of globally operating value chains. This paper aims to differentiate and define the functionalities of tracking and tracing technologies, which are frequently used interchangeably in the literature. Furthermore, this paper incorporates influencing factors that impact the sequencing of the connected world in Industry 4.0 supply chain networks. This includes legal influences, the embedment of supply chain-related standards, and new possibilities of emerging technologies. Finally, the results are summarized in a model for the holistic mapping of supply chains by means of tracking and tracing technologies. The technological solutions that can be derived from the model enable companies to address missing elements in order to enable the holistic mapping of supply chain events as well as the transparent representation of a digital shadow throughout the entire supply chain.
The proper selection of a demand forecasting method is directly linked to the success of supply chain management (SCM). However, today’s manufacturing companies are confronted with uncertain and dynamic markets. Consequently, classical statistical methods are not always appropriate for accurate and reliable forecasting. Algorithms of artificial intelligence (AI) are currently used to improve statistical methods. Existing literature only gives a very general overview of the AI methods used in combination with demand forecasting. This paper provides an analysis of the AI methods published in the last five years (2017-2021). Furthermore, a classification is presented by clustering the AI methods in order to identify trends in the methods applied. Finally, a classification of the different AI methods according to the dimensionality of data, volume of data, and time horizon of the forecast is presented. The goal is to support the selection of the appropriate AI method to optimize demand forecasting.
Artificial intelligence is a field of research that is seen as a means of realizing digitalization and Industry 4.0. It is considered the critical technology needed to drive the future evolution of manufacturing systems. At the same time, autonomous guided vehicles (AGVs) have developed into an essential component of manufacturing systems due to the flexibility they contribute to the whole manufacturing process. However, there are still open challenges in the intelligent control of these vehicles on the factory floor, especially in dynamic environments where resources should be controlled in such a way that they can be adjusted to turbulences efficiently. Therefore, this paper develops a conceptual framework based on a catalog of criteria that considers several machine learning algorithms to find the optimal algorithm for the intelligent control of AGVs. By applying the developed framework, the algorithm most suitable for the current operation of the AGV is selected automatically, enabling efficient control within the factory environment. In future work, this decision-making framework can be transferred to further scenarios with multiple AGV systems, including internal communication along AGV fleets. With this study, the automatic selection of the optimal machine learning algorithm for the AGV improves performance in such a way that computational power is distributed efficiently within a hybrid system linking the AGV and cloud storage.
The functionality of existing cyber-physical production systems generally focuses on mapping technological specifications derived from production requirements. Consequently, such systems base their conception on a structurally mechanistic paradigm. Insofar as these approaches have considered humans, their conception is likewise based on the structurally identical paradigm. Due to the fundamental reorientation towards explicitly human-centered approaches, it becomes more and more apparent that essential aspects of the dimension "human" remain unconsidered by the previous paradigm. To overcome such limitations, mapping the "social" dimension requires a structurally different approach. In this paper, an anthropocentric approach is developed based on possible conceptions of the human being, enabling a structural integration of the human being in an extended dimension. Through the model, extending concepts for better integration of the human being in the sense of human-centered approaches, as envisioned in the Industrie 5.0 conception, becomes possible.
The paradigmatic shift of production systems towards Cyber-Physical Production Systems (CPPSs) requires the development of flexible and decentralized approaches. In this way, such systems enable manufacturers to respond quickly and accurately to changing requirements. However, domain-specific applications require the use of suitable conceptualizations, and the issue at hand when using various conceptualizations is the interoperability of different ontologies. Achieving flexibility and adaptability in CPPSs, though, requires overcoming interoperability issues within them. This paper presents an approach to increase flexibility and adaptability in CPPSs while addressing the interoperability issue. In this work, OWL ontologies conceptualize domain knowledge, and the Intelligent Manufacturing Knowledge Ontology Repository (IMKOR) connects the domain knowledge in the different ontologies. Tests examined whether adaptations in one ontology within the IMKOR provide knowledge to the whole IMKOR. The tests showed positive results: the repository makes the knowledge available to the whole CPPS. Furthermore, an increase in flexibility and adaptability was observed.
Adoption of artificial intelligence (AI) has risen sharply in recent years, but many firms are not successful in realising the expected benefits or even terminate projects before completion. While a number of previous studies highlight challenges in AI projects, the critical factors that lead to project failure are mostly unknown. The aim of this study is therefore to identify distinct factors that are critical for the failure of AI projects. To address this, interviews with experts in the field of AI from different industries were conducted, and the results were analyzed using qualitative analysis methods. The results show that both organizational and technological issues can cause project failure. Our study contributes to knowledge by reviewing previously identified challenges in terms of their criticality for project failure based on new empirical data, as well as by identifying previously unknown factors.
A closed-loop control for a cooperative innovation culture in interorganizational R&D projects
(2022)
Since project managers have only limited authority in interorganizational R&D projects, a cooperative innovation culture is essential for team cohesion and thus for achieving the project scope on time and within budget. For its development, different factors depending on underlying values are essential. These factors must be learned iteratively by the project members so that they live the values of a cooperative innovation culture. Hence, this paper raises the following research question: “How can living the values of a cooperative innovation culture be controlled in interorganizational R&D projects?” To answer this question, a closed-loop control for a cooperative innovation culture is developed. The developed closed-loop control system includes several functional units, which represent essential roles, and several variables, which show what to consider and design in the control system. In addition, the developed closed-loop control system is generalized for other types of projects, such as intraorganizational projects.
Allyls
(2022)
This chapter addresses the importance and usage of the commercially low-volume thermoset plastics group known as allyls. The three significant subelements of this group are poly(diallylphthalates), poly(diallylisophthalates), and poly(allyldiglycol carbonate). Chemistry, processing, and properties are also described. Allyl polymers are synthesized by radical polymerizations of allyl monomers that usually do not produce high-molecular-mass macromolecules. Therefore only a few specific monomers can produce thermosetting materials. Diallyldiglycolcarbonate (CR-39) and diallylphthalates are the most significant examples that have considerably improved our everyday life.
Unsaturated polyester resins (UPR) and vinyl ester resins (VER) are among the most commercially important thermosetting matrix materials for composites. Although comparatively low in cost, they offer technological performance suitable for a wide range of applications, such as fiber-reinforced plastics, artificial marble or onyx, polymer concrete, or gel coats. The main areas of UPR consumption include the wind energy, marine, pipe and tank, transportation, and construction industries.
This chapter discusses basic UPR and VER chemistry and technology of manufacturing, and consequent applications. Some important properties and performance characteristics are discussed, such as shrinkage behavior, flame retardance, and property modification by nanoparticles. Also briefly introduced and described are the practical aspects of UPR and VER processing, with special emphasis on the most widely used technological approaches, such as hand and spray layup, resin infusion, resin transfer molding, sheet and bulk molding, pultrusion, winding, and centrifugal casting.
Cross-linked thermoplastics
(2022)
Cross-linked thermoplastics represent an important class of materials for numerous applications such as heat-shrinkable tubing, rotational molded parts, and polyolefin foams. By cross-linking olefins, their mechanical performance can be significantly enhanced. This chapter covers the three main methods for the cross-linking of thermoplastics: radiation cross-linking, chemical cross-linking with organic peroxides, and cross-linking using silane-grafting agents. It also considers the major effects of the cross-linking procedure on the performance of the thermoplastic materials discussed.
Silicones
(2022)
Silicones are found in a variety of applications with requirements that range from long life at elevated temperatures to fluidity at low temperatures. This chapter first considers silicone elastomers and their application in room temperature vulcanizing (RTV) and heat curing systems (HTV). Also, new technologies for UV curing are introduced. Coverage of RTVs includes both one-component and two-component systems and the different cure chemistries of each and is followed by a separate discussion of silicone laminates. Due to the high importance of silicone fluids, they are also discussed. Fluids include polishes, release agents, surfactants, and dielectric fluids.
Cyanate ester resins
(2022)
Cyanate ester resins are an important class of thermosetting compounds that experience an ever-increasing interest as matrix systems for advanced polymer composite materials, which among other application fields are especially suitable for highly demanding applications in the aerospace or microelectronics industries. Other names for cyanate ester resins are cyanate resins, cyanic esters, or triazine resins. The various types of cyanate ester monomers share the –OCN functional group that trimerizes in the course of resin formation to yield a highly branched heterocyclic polymeric network based on the substituted triazine core structure.
Process analysis and process control have attracted increasing interest in recent years. The development and application of process analytical methods are a prerequisite for the knowledge-based manufacturing of industrial goods and allow for the production of high-value products of defined, consistently good quality. Discussed in this chapter are the measurement principles and some relevant aspects and illustrative examples of online monitoring tools as the basis for process control in the manufacturing and processing of thermosetting resins. Optical spectroscopy is featured as one of the main process analytical methods, applicable, among other uses, to online monitoring of resin synthesis. In combination with chemometric methods for multivariate data analysis, powerful process models can be generated within the framework of feedback and feed-forward control concepts. Other analytical methods covered in this chapter are those frequently used to control the further processing of thermosets into final parts, including dielectric analysis, ultrasonics, fiber optics, and Fiber Bragg Grating sensors.
Self-healing thermosets
(2022)
This chapter discusses the basic extrinsic, intrinsic, and combined extrinsic/intrinsic strategies for equipping thermosetting polymers with self-healing properties. The main focus is on a holistic optimization of thermosetting materials, that is, on a simultaneous optimization of both self-healing and other, specialized material properties. Due to their very rigid, highly cross-linked three-dimensional structure, thermosetting polymers require special chemical strategies to achieve self-healing properties. The main chemical strategies available for this are briefly outlined. The examples given illustrate interesting and/or typical procedures and serve as inspiration for finding solutions for the reader's own applications. They summarize important recent developments in research and technology aiming toward multifunctional, truly smart self-healing thermosetting materials. An important aspect of this topic is also how the self-healing effects are analytically verified, quantified, and evaluated. A range of measuring methods is available for this purpose. In this chapter, the most important analytical tools for testing self-healing properties are briefly introduced and highlighted with some illustrative examples.
Today, many industrial tasks are not automated and still require human intervention. One of these tasks is the unloading of overseas containers: after transportation to the sorting center, the containers must be unloaded manually before the parcels can be forwarded to the recipients. Robot-based automatic unloading of containers has therefore been researched. However, the promising results of the systems developed in earlier projects could not be commercialized due to problems with their reliability. Mechanical, algorithmic, or other limitations are possible causes of the observed errors. To analyze these errors, the results of the robot's work must be evaluated without complicating the existing system by adding new sensors. This paper presents a reference system based on machine learning to evaluate the robotic grasping of parcels. It analyzes two states of the container: before and after picking up one box. The states are represented as point clouds received from a laser scanner. The proposed system evaluates the success of transferring a box from an overseas container to the sorting line by supervised learning using convolutional neural networks (CNN) and manual labeling of the data. The process of obtaining a working model using a hyperband model search, with a maximum classification error of 3.9 %, is also described.
Theory and practice of implementing a successful enterprise IoT strategy in the industry 4.0 era
(2021)
Since the arrival of the internet and affordable access to technologies, digital technologies have occupied a growing place in industry, propelling us towards a fourth industrial revolution: Industry 4.0. In today's era of digital upheaval, enterprises are increasingly undergoing transformations that lead to their digitalization. The traditional manufacturing industry is in the throes of a digital transformation accelerated by exponentially growing technologies (e.g., intelligent robots, the Internet of Things, sensors, 3D printing). Around the world, enterprises are in a frantic race to implement IoT-based solutions to improve productivity and innovation, reduce costs, and strengthen their position in international markets. Considering the immense transformative potential that IoT and big data bring to the industrial sector, adopting IoT across industrial systems is essential to remain competitive and thus transform the factory into a smart factory. This paper describes the innovation and digitalization process, following the Industry 4.0 paradigm, for implementing a successful enterprise IoT strategy.
Driven by cyber-physical systems and application-oriented approaches to AI, conventional production systems are evolving more and more into "smart" production systems, which are characterized, among other things, by a high level of communication and integration of the individual components. The exchange of information between the systems is usually oriented only towards the data content, with semantics considered only implicitly. The adaptability required by external and internal influences demands the integration of new components or the redesign of existing ones. Through an open, application-oriented ontology, the information and communication exchange is extended by explicit semantic information. This enables better integration of new components and easier reconfiguration of existing ones. The developed ontology and the derived application and use of the semantic information are evaluated by means of a practical use case.
Business opportunities for energy providers to utilize flexible industrial demand are platform-based, connecting small and medium-sized enterprises (SMEs) to a virtual power plant (VPP) in complex ecosystems. Unlike in other VPPs, the focus is on participation, data, and control sovereignty for the SMEs. An exemplary application for an existing cement mill demonstrates positive margins. Viable VPP business models for small and medium-sized utilities include the “orchestrator,” i.e., adding value by linking services of specialized providers, the “integrator,” i.e., incorporating internal and external processes and resources, as well as the “white label user,” i.e., using a turn-key VPP from an exclusive cooperation partner.
Maintenance is an increasingly complex and knowledge-intensive field. In order to address these challenges, assistance systems based on augmented, mixed, or virtual reality can be applied. Therefore, the objective of this paper is to present a framework that can be used to identify, select, and implement an assistance system based on reality technology in the maintenance environment. The development of the framework is based on a systematic literature review and subject matter expert interviews. The framework provides the best technological and economic solution in several steps. The validation of the framework is carried out through a case study.
This paper presents the concept of the system architecture of a flexible cyber-physical factory control system. The system allows the automation of process structures using cyber-physical fractal nodes. These nodes have a functional and independent form and can be clustered to larger structures. This makes it possible to equip the factory with a flexible, freely scalable, modular system. The description of this system architecture and the associated rules and conditions is outlined in the concept.
Increasing complexity in manufacturing processes poses new challenges for industrial maintenance. In addition, advanced machine monitoring and lifetime forecasting options expand the tools and maintenance strategies available. Today, maintenance strategy selection is performed sequentially, usually based on prioritised machines and components. These selections are optimized locally for each machine in isolation, without considering the context of other machines within the value-adding network. To overcome these challenges, this paper presents an approach for integrated maintenance strategy selection in one step, using an integrated model that considers possible machine failures and the context of other machines within the value-adding network in parallel.
Railway operators are challenged by increasing complexity and by the need to safeguard the availability of passenger rolling stock, bringing maintenance, and especially emerging technologies, into focus. This paper presents a model for the selection and implementation of Industry 4.0 technologies in rolling stock maintenance. The model consists of different stages and considers the main components of rolling stock, the related appropriate maintenance strategies, and Industry 4.0 technologies, taking into account the maturity level of the railway operators. Relevant criteria and the main prerequisites of the technologies were identified. The model proposes relevant activities and was validated by industry experts.
The technologies of digital transformation, such as the Internet-of-Things (IoT), artificial intelligence or predictive maintenance enable significant efficiency gains in industry and are becoming increasingly important as a competitive factor. However, their successful implementation and creative, future application requires the broad acceptance and knowledge of non-IT-related groups, such as production management students, engineers or skilled workers, which is still lacking today. This paper presents a low-threshold training concept bringing IoT-technologies and applications into manufacturing related higher education and employee training. The concept addresses the relevant topics starting from IoT-basics to predictive maintenance using mobile low-cost hardware and infrastructure.
The maintenance of railway infrastructure remains a challenge. Data acquisition technologies have evolved because of Industry 4.0, expanding the capabilities of predictive maintenance. Despite the advances, the potential of these emerging technologies has not been fully realised. This paper presents a technology selection framework in support of railway infrastructure predictive maintenance, which is based on qualitative methods. It consists of three stages, including the mapping of the infrastructure characteristics with the identified technologies, the evaluation of the most appropriate technologies, and the sourcing thereof. This presents the collective decision support output of the framework.
Today's logistics systems are characterized by uncertainty and constantly changing requirements. Rising demand for customized products, short product life cycles, and a large number of variants increase the complexity of these systems enormously. In particular, intralogistics material flow systems must be able to adapt to changing conditions at short notice, with little effort and at low cost. To fulfil these requirements, the material flow system needs to be flexible in three important parameters, namely layout, throughput, and product. While the scope of these flexibility parameters is described in the literature, their respective effects on an intralogistics material flow system and the influencing factors are mostly unknown. This paper describes how the flexibility parameters of an intralogistics system can be determined using a multi-method simulation. The study was conducted in the learning factory "Werk150" on the campus of Reutlingen University, with its different means of transport and processes, and was validated through practical experiments.
The production environment faces copious challenges, but likewise discovers many new potential opportunities. To meet the new requirements caused by the development towards mass customization, human-robot cooperation (HRC) has been identified as a key technology and is becoming more and more important. HRC combines the strengths of robots, such as reliability, endurance, and repeatability, with the strengths of humans, for instance flexibility and decision-making skills. Notwithstanding the high potential of HRC applications, the technology has not achieved a breakthrough in production so far. Studies have shown that one of the biggest obstacles to implementing HRC is the allocation of tasks. Another key technology that offers various opportunities to improve the production environment is artificial intelligence (AI). This paper therefore describes an AI-supported method to improve work organization in HRC with regard to task allocation. The aim of this method is to build a dynamic, semi-autonomous group work environment that keeps not only employee motivation at a high level but also product quality, owing to a decreased failure rate. The AI helps to detect the condition in which the employee delivers the best performance and also supports identifying the moment when the worker leaves this optimal state. As soon as the employee reaches this trigger event, the allocation of tasks adapts based on the identified stress. This adaptation aims to return the employee to the state of optimal performance. In order to realize such a dynamic allocation, the method describes the creation of a pool of various interaction scenarios, as well as the AI-supported recognition of the defined trigger event.
Manufacturing companies are confronted with external (e.g., short-term change of product configuration by the customer) and internal (e.g., production process deviations) turbulences that affect the performance of production. Predefined, centrally controlled logistics processes limit the possibilities of production to initiate countermeasures and react in an optimized way to these turbulences. The autonomous control of intralogistics offers great potential to cope with these turbulences by using the respective flexibility corridors of production systems and applying intelligent logistic objects with decentralized decision and process execution capabilities to maintain a target-optimized production. A method for AI-based storage-location and material-handling optimization has been developed to achieve a performance-optimized intralogistics system through continuous monitoring of performance-relevant parameters and influencing factors using AI (e.g., for pattern recognition). To provide the basis for investigating and demonstrating the potential of autonomously controlled intralogistics in connection with production turbulences and in combination with AI, an intelligent warehouse involving an indoor localization system, smart bins, and manual, semi-automated/collaborative, and autonomous transport systems has been developed and implemented at Werk150, the factory on campus of ESB Business School (Reutlingen University). This scenario, which has been integrated into graduate training modules, allows the analysis and demonstration of different intralogistics measures to cope with turbulences in production, involving, among others, storage and material provision processes. The target fulfilment of the applied intralogistics measures in mastering arising turbulences is assessed based on the overall performance of production, considering lead times and adherence to delivery dates.
By applying artificial intelligence (AI) algorithms, the intelligent logistic objects (smart bins, transport systems, etc.) as well as the entire logistics system should be enabled to improve their decision and process execution capabilities so as to master short-term turbulences in the production system autonomously.
Training at assembly workstations in production in SMEs (small and medium-sized companies) often does not take place at all, or only insufficiently. In addition to the lack of technical content, there are also, from an ergonomic point of view, incorrect movement sequences that "untrained" people usually acquire automatically. An AI-based approach is used to analyze a defined workflow for a specific assembly scope with regard to the behavior of several employees. Based on these different behaviors, the AI gives feedback on the points in time, work steps, and movements at which particularly dangerous incorrect postures occur. Motion capturing and digital human model simulation, in combination with the results of the AI, define the optimized workflow. Individual employees can be trained directly, since the AI identifies their most serious incorrect postures and provides them with a direct comparison of their "wrong" posture and a joint-friendly posture. With the assistance of various test persons, the AI can conduct a study in which the most frequently occurring incorrect postures are identified. This can be done in general or tailored to specific groups of people (e.g., "People over 1.90 m tall must be particularly careful not to make the following mistake..."). The approach will be tested and validated at Werk150, the factory of ESB Business School on the campus of Reutlingen University. The newly gained knowledge will subsequently be used for training in SMEs.
Human bestrophin-1 (hBest1) is a transmembrane Ca2+-dependent anion channel associated with the transport of Cl− and HCO3− ions, γ-aminobutyric acid (GABA), and glutamate (Glu), and with the regulation of retinal homeostasis. Its mutant forms cause retinal degenerative diseases, defined as bestrophinopathies. Using both physicochemical approaches (surface pressure/mean molecular area (π/A) isotherms, hysteresis, and compressibility moduli of hBest1/sphingomyelin (SM) monolayers, and Brewster angle microscopy (BAM) studies) and biological approaches (detergent membrane fractionation, Laurdan (6-dodecanoyl-N,N-dimethyl-2-naphthylamine) staining, and immunofluorescence staining of stably transfected MDCK-hBest1 and MDCK II cells), we report:
1) Ca2+, Glu, and GABA interact with binary hBest1/SM monolayers at 35 °C, resulting in changes in hBest1 surface conformation, structure, self-organization, and surface dynamics. The mixing process in hBest1/SM monolayers is spontaneous, and the effect of the protein on the binary films was defined as "fluidizing", hindering the phase transition of the monolayer from the liquid-expanded to the intermediate (LE-M) state;
2) in stably transfected MDCK-hBest1 cells, bestrophin-1 was distributed between detergent-resistant (DRM) and detergent-soluble membranes (DSM), up to 30 % and 70 %, respectively; in live cells, hBest1 was visualized in both liquid-ordered (Lo) and liquid-disordered (Ld) fractions, with protein association quantified as up to 35 % with Lo and 65 % with Ld. Our results indicate that the spontaneous miscibility of hBest1 and SM is a prerequisite for diverse protein interactions with membrane domains, different structural conformations, and biological functions.
Employing diffuse reflection ultraviolet-visible (UV–Vis) spectroscopy, we developed an approach capable of quantitatively determining flux residues on a technical copper surface. The technical copper surface was soldered with a no-clean flux system of organic acids. Through a post-solder cleaning step with different cleaning parameters, various levels of residues were produced. The surface was quantitatively and qualitatively characterized using X-ray photoelectron spectroscopy (XPS), Auger electron spectroscopy (AES), Fourier transform infrared spectroscopy (FTIR), and diffuse reflection UV–Vis spectroscopy. Using multivariate analysis (MVA), we examined the UV–Vis data to establish a correlation with the carbon content on the surface. The UV–Vis data could be discriminated for all groups by their level of organic residues. Combined with the XPS results, the data were evaluated by partial least squares (PLS) regression to establish a model. Based on this predictive model, the carbon content was calculated with an absolute error of 2.7 at.%. Owing to the high correlation of the predictive model, the easy-to-use measurement, and the evaluation by multivariate analysis, the developed method appears suitable for an online monitoring system. With such a system, flux residues can be detected in a manufacturing cleaning process of technical surfaces after soldering.
Respiratory diseases are leading causes of death and disability worldwide. The recent COVID-19 pandemic also affects the respiratory system. Detecting and diagnosing respiratory diseases has traditionally required both medical professionals and a clinical environment, and most of the techniques used to date have been invasive or expensive.
Some research groups are developing hardware devices and techniques to make non-invasive or even remote respiratory sound acquisition possible. These sounds are then processed and analysed for clinical, scientific, or educational purposes.
We present a literature review of non-invasive sound acquisition devices and techniques.
The results cover a large number of digital tools, such as microphones, wearables, and Internet of Things devices, that can be used for this purpose.
Some interesting applications have been found. Some devices make sound acquisition easier in a clinical environment, while others enable daily monitoring outside it. We aim to use some of these devices and include the non-invasively recorded respiratory sounds in a Digital Twin system for personalized health.
Normal breathing during sleep is essential for people's health and well-being. It is therefore crucial to diagnose apnoea events at an early stage and apply appropriate therapy. The detection of sleep apnoea is the central goal of the system design described in this article. To develop a correctly functioning system, it is first necessary to clearly define the requirements, as outlined in this manuscript. Furthermore, the selection of an appropriate technology for measuring respiration is of great importance. After an initial literature search, we analysed three different methods in detail and selected a suitable one according to the defined requirements. After weighing the advantages and disadvantages of the three approaches, we decided to use the one based on impedance measurement. As a next step, an initial conceptual design of the algorithm for detecting apnoea events was created. As a result, we developed an activity diagram in which the main system components and data flows are visually represented.
Preliminary results of homomorphic deconvolution application to surface EMG signals during walking
(2021)
Homomorphic deconvolution is applied to sEMG signals recorded during walking. Gastrocnemius lateralis and tibialis anterior signals were acquired according to the SENIAM recommendations. MUAP parameters such as amplitude and scale were estimated, while the MUAP shape parameter was held fixed. This yields a useful time-frequency representation of the sEMG signal. The estimate of the MUAP scale parameter was verified by extracting the mean frequency of the filtered EMG signal, derived from the scale parameter estimated with two different MUAP shape values.
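The homomorphic step itself can be sketched with the real cepstrum, the standard transform underlying homomorphic deconvolution. The sampling rate, toy MUAP shape, firing pattern, and lifter cutoff below are illustrative assumptions, not the parameters used in the study.

```python
import numpy as np

def real_cepstrum(signal: np.ndarray) -> np.ndarray:
    """Real cepstrum: inverse FFT of the log magnitude spectrum.

    Homomorphic deconvolution rests on this transform: convolution in
    the time domain becomes addition in the quefrency domain, so a
    smooth MUAP shape and a spiky firing train separate into low and
    high quefrencies.
    """
    spectrum = np.fft.fft(signal)
    log_mag = np.log(np.abs(spectrum) + 1e-12)  # small offset avoids log(0)
    return np.fft.ifft(log_mag).real

# Toy example: a synthetic "MUAP" convolved with an impulse train.
fs = 1000  # Hz, assumed sampling rate
t = np.arange(0.0, 0.02, 1.0 / fs)
muap = np.exp(-300.0 * t) * np.sin(2 * np.pi * 120.0 * t)  # toy MUAP shape
firings = np.zeros(1000)
firings[::100] = 1.0  # toy firing instants every 100 ms
semg = np.convolve(firings, muap)[:1000]

ceps = real_cepstrum(semg)

# A low-quefrency lifter keeps the contribution of the MUAP shape and
# discards the firing-train part.
cutoff = 30
liftered = ceps.copy()
liftered[cutoff:-cutoff] = 0.0
```

Transforming the liftered cepstrum back (FFT, exponentiation, inverse FFT) would recover an estimate of the MUAP spectrum alone, which is the separation the deconvolution exploits.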
The present work proposes the use of modern ICT technologies, such as smartphones, NFC, the internet, and web technologies, to help patients carry out their therapies. The implemented system provides a calendar with reminders for medication intake, ensures drug identification through NFC, and allows remote assistance from healthcare staff and family members to check and manage the therapy in real time. The system also provides centralized information on the patient's therapeutic situation, helpful in choosing new compatible therapies.
This paper covers the test and verification of a forecast-based Monte Carlo algorithm for the optimized, demand-oriented operation of combined heat and power (CHP) units using the hardware-in-the-loop approach. For this purpose, the optimization algorithm was implemented at a test bench at Reutlingen University for controlling a CHP unit in combination with a thermal energy storage, both in real hardware. In detail, the hardware-in-the-loop tests are intended to reveal the effects of demand forecasting accuracy, the impact of thermal energy storage capacity, and the influence of load profiles on the demand-oriented operation of CHP units. In addition, the paper focuses on the evaluation of the energy content of the thermal energy storage under practical conditions. It is shown that a 5-layer model allows the stored energy to be determined quite accurately, which is verified by experimental results. The hardware-in-the-loop tests disclose that demand forecasting accuracy, especially for electricity demand, as well as the load profiles strongly impact the potential for on-site CHP electricity utilization in demand-oriented mode. Moreover, it is shown that a larger effective capacity of the thermal energy storage positively affects demand-oriented operation. In the hardware-in-the-loop tests, the fraction of electricity generated by the CHP unit and utilized on-site could thus be increased by a maximum of 27% compared to heat-led operation, which is still the most common modus operandi of small-scale CHP plants. Hence, the hardware-in-the-loop tests were adequate to prove the significant impact of the proposed algorithm for the optimization of demand-oriented operation of CHP units.
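A layered evaluation of the stored energy, as used in such a 5-layer model, can be sketched roughly as follows. The material constants, reference temperature, storage volume, and layer temperatures are hypothetical illustration values, not the test-bench configuration.

```python
# Hypothetical sketch of the energy content of a stratified (layered)
# thermal energy storage. Five equal-volume layers are assumed; all
# values are illustrative.
RHO_WATER = 998.0   # density of water, kg/m^3
CP_WATER = 4186.0   # specific heat capacity of water, J/(kg*K)

def stored_energy_kwh(layer_temps_c, volume_m3, t_ref_c=40.0):
    """Sum each layer's sensible heat relative to a reference
    temperature and convert from joules to kWh."""
    layer_volume = volume_m3 / len(layer_temps_c)
    joules = sum(
        RHO_WATER * layer_volume * CP_WATER * (temp - t_ref_c)
        for temp in layer_temps_c
    )
    return joules / 3.6e6  # 1 kWh = 3.6e6 J

# Example: a 1 m^3 storage with five measured layer temperatures (deg C),
# warmest at the top.
energy = stored_energy_kwh([75.0, 70.0, 62.0, 55.0, 48.0], 1.0)
print(f"stored energy: {energy:.2f} kWh")  # roughly 25.5 kWh
```

Compared with treating the tank as one well-mixed volume at an average temperature, summing over layers captures stratification, which is what makes the layered model more accurate in practice.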