Rare but extreme events, such as pandemics, terror attacks, and stock market collapses, pose a risk that could undermine cooperation in societies and groups. We extend the public goods game (PGG) to investigate the relationship between rare but extreme external risks and cooperation in a laboratory experiment. By incorporating risk as an external random variable in the PGG, independent of the participants' contributions, we preserve the economic equilibrium of non-cooperation in the original game. Furthermore, we examine whether cooperation can be restored by the relatively simple intervention of providing information about countermeasures while keeping the actual risk constant. Our experimental results reveal that, on average, extreme risks indeed decrease contributions by about 20%; however, countermeasure information increases contributions by about 10%. Specifically, in the first interactions, cooperation levels can even reach those observed in the riskless baseline. Our results suggest that countermeasure information could help reinforce social cohesion and resilience in the face of rare but extreme risks.
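The extended game described above can be sketched as follows; the endowment, marginal per-capita return, and shock parameters here are illustrative assumptions, not the values used in the study.

```python
import random

def pgg_round(contributions, endowment=20, mpcr=0.4,
              shock_prob=0.05, shock_loss=0.9, rng=random):
    """One round of a public goods game with a rare external shock.

    Each player keeps (endowment - contribution) and receives an equal
    share mpcr * pool of the public pool. With probability shock_prob,
    an external event destroys the fraction shock_loss of every payoff,
    independently of the contributions, so free-riding remains the
    dominant strategy (mpcr < 1), as in the original game.
    Parameters are illustrative, not those of the study.
    """
    pool = sum(contributions)
    payoffs = [endowment - c + mpcr * pool for c in contributions]
    if rng.random() < shock_prob:                 # rare, extreme event
        payoffs = [p * (1 - shock_loss) for p in payoffs]
    return payoffs

# No shock with this seed: contributions raise group welfare,
# but the zero contributor still earns the most.
payoffs = pgg_round([10, 10, 0, 5], rng=random.Random(1))
```

Because the shock multiplies all payoffs equally, it leaves the ranking of strategies, and hence the non-cooperative equilibrium, unchanged.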
Wave-like differential equations occur in many engineering applications. Here the engineering setup is embedded into the framework of functional analysis of modern mathematical physics. After an overview, the Hilbert space approach to free Euler–Bernoulli bending vibrations of a beam in one spatial dimension is investigated. We analyze in detail the corresponding positive, self-adjoint fourth-order differential operators associated to the boundary conditions in statics. A comparison with the free vibrations of a string is outlined.
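In standard textbook form (the notation here is the common convention, not necessarily the authors'), the free Euler–Bernoulli bending vibration of a uniform beam of length $L$ is governed by a fourth-order equation in space, in contrast to the second-order wave equation of the string:

```latex
% Free Euler--Bernoulli beam: flexural rigidity EI, mass per length \rho A
\rho A \,\frac{\partial^{2} w}{\partial t^{2}}(x,t)
  \;+\; E I \,\frac{\partial^{4} w}{\partial x^{4}}(x,t) \;=\; 0 ,
\qquad 0 < x < L ,
% e.g. with clamped-end boundary conditions
w(0,t) = \partial_x w(0,t) = w(L,t) = \partial_x w(L,t) = 0 .
% The spatial operator A = (EI/\rho A)\, d^4\!/dx^4 with such boundary
% conditions is positive and self-adjoint on L^2(0,L).
```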
Automatic content creation system for augmented reality maintenance applications for legacy machines
(2024)
Augmented reality (AR) applications have great potential to assist maintenance workers in their operations. However, creating AR solutions is time-consuming and laborious, which limits their widespread adoption in the industry. As a result, even for latest-generation machines the user often receives only an electronic manual for equipment operation and maintenance instead of an AR solution; this is commonplace with legacy machines. For this reason, solutions are required that simplify the creation of such AR solutions. This paper presents an approach that uses an electronic manual as a basis to create fast and cost-effective AR solutions for maintenance. As part of the approach, an application was developed to automatically identify and subdivide the chapters of electronic manuals via the bookmarks in the table of contents. The contents are then automatically uploaded to a central server and indexed with a suitable marker to make the data retrievable. The prepared content can then be accessed via the marker to create context-related AR instructions. The application is characterized by the fact that no developers or experts are required to prepare the information. In addition to complying with common design criteria, the clear presentation of the contents and the intuitive use of the system offer added value for the performance of maintenance tasks. Together, these two elements form a novel way to retrofit legacy machines with AR maintenance instructions. The practical validation of the system took place in a factory environment. For this purpose, content was created for a filter change on a CNC milling machine. The results show that inexperienced users can extract appropriate content with the software application. Furthermore, it is shown that maintenance workers can access the content with an AR application developed for the Microsoft HoloLens 2 and complete simple tasks provided in the manufacturer's electronic manual.
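The chapter subdivision step can be illustrated with a small sketch: given the top-level bookmarks of an electronic manual (in practice read from the PDF's table of contents, e.g. with a PDF library), each chapter spans the pages from its own bookmark to the next one. Function and variable names here are hypothetical, not taken from the described application.

```python
def chapter_ranges(bookmarks, total_pages):
    """Map top-level bookmarks to chapter page ranges.

    bookmarks   -- list of (title, start_page) pairs from the manual's
                   table of contents (0-based start pages)
    total_pages -- number of pages in the manual
    Returns {title: (start, end)} with end exclusive, so each chapter
    runs up to the next bookmark (or to the end of the document).
    """
    marks = sorted(bookmarks, key=lambda b: b[1])
    ranges = {}
    for (title, start), (_, nxt) in zip(marks, marks[1:] + [("", total_pages)]):
        ranges[title] = (start, nxt)
    return ranges

# Hypothetical manual with three bookmarked chapters
toc = [("Operation", 4), ("Introduction", 0), ("Maintenance", 12)]
chapters = chapter_ranges(toc, total_pages=30)
# chapters == {"Introduction": (0, 4), "Operation": (4, 12), "Maintenance": (12, 30)}
```

Each resulting page range can then be uploaded and indexed with its marker independently.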
Plasmonics and nanophotonics both deal with the interaction of light with structures of typically sub-wavelength size in one or more dimensions. Over the past decade or two, interest in these topics has grown significantly. This includes basic research towards a detailed understanding of light-matter interaction and the manipulation of light on the nanometer scale, as well as the search for applications ranging from quantum information processing, data storage, solar cells, spectroscopy and microscopy to (bio-)sensors and biomedical devices. Key enablers for this development are advanced materials and the variety of techniques to structure them with nanometer precision on the one hand, and progress in the theoretical description and numerical implementations on the other. Besides the traditional metals Au, Ag, Al, and Cu, compounds such as refractory metal nitrides with much higher durability, as well as semiconductors, dielectrics, and hybrid structures, have also become of interest. Structuring techniques aim not only at the fabrication of individual elements with highest precision for detailed interaction analysis, but also at methods for large-scale, low-cost nanofabrication, mostly for sensor applications. In the former case, mostly electron beam lithography and focused ion beam milling are employed, while for high throughput various forms of nanoimprint and self-assembly based techniques are favored. Thin film deposition and pattern transfer techniques are mostly derived from those developed for nano-electronics; however, more recently methods such as electroless plating, atomic layer deposition or etching, and 3-D additive techniques are appearing. Thus, highly specialized expertise has been acquired in the different disciplines, and successful research and technology transfer will draw from this pool of knowledge.
Different network architectures are being used to build remote laboratories. Historically, it has been difficult to integrate industrial control systems with higher-level IT systems such as enterprise resource planning (ERP), manufacturing execution systems (MES), and manufacturing operations management (MOM). Getting these systems to communicate with one another has proven difficult due to the absence of shared protocols between them. The Open Platform Communications Unified Architecture (OPC UA) protocol was introduced as a remedy for this issue and is gaining popularity, but what if open-source protocols that are widely used in the IT industry could be used instead? This paper presents the development of an IT architecture for a cyber-physical industrial control systems laboratory that enables seamless interconnection and integration of its elements. The architecture utilises Node-RED, an open-source programming platform developed by IBM that is focused on making it simple to link physical components, APIs, and web services. This cyber-physical laboratory is for learning the principles of an industrial cascaded process control factory. Finally, this paper also discusses future work relating to digital twins (DT). A coupled tank system is selected as a teaching factory to illustrate a range of fluid control applications in a typical chemical process factory.
In the context of Industry 4.0, intralogistics faces an increasingly complex and dynamic environment driven by a high level of product customisation and complex manufacturing processes. One approach to deal with these changing conditions is the decentralised and intelligent connectivity of intralogistics systems. However, wireless connectivity presents a major challenge in the industry due to strict requirements such as safety and real-time data transmission. In this context, the fifth generation of mobile communications (5G) is a promising technology to meet the requirements of safety-critical applications, particularly since 5G offers the possibility of establishing private 5G networks, also referred to as standalone non-public networks. Through their isolation from public networks, private 5G networks provide exclusive coverage for private organisations, offering them high intrinsic network control and data security. However, 5G is still under development and is being introduced gradually in a continuous release process. This process lacks transparency regarding the performance of 5G in individual releases, complicating the successful adoption of 5G as an industrial communication technology. Additionally, the evaluation of 5G against the specified target performance is insufficient because of the impact of the environment and external interfering factors on 5G in the industrial environment. Therefore, this paper aims to develop a technical decision-support framework that takes a holistic approach to evaluating the practicality of 5G for intralogistics use cases in two fundamental stages. The first stage analyses technical parameters and characteristics of the use case to evaluate the theoretical feasibility of 5G. The second stage investigates the application's environment, which substantially impacts the practicality of 5G, for instance through the influence of surrounding materials.
Finally, a case study validates the proposed framework by means of an autonomous mobile robot. The validation proves the framework's applicability and shows the practicality of the autonomous mobile robot when integrated into a private 5G network testbed.
Cyber-Physical Production Systems increasingly use semantic information to meet growing flexibility requirements. Ontologies are often used to represent and use this semantic information. Existing systems focus on mapping knowledge and less on the exchange with other relevant IT systems (e.g., ERP systems), which contain crucial, often implicit, semantic information. This article presents an approach that enables the exchange of semantic information via adapters. The approach is demonstrated by a use case utilizing an MES and an ERP system.
The present study investigated the possibilities and limitations of using a low-cost NIR spectrometer for the verification of the presence of the declared active pharmaceutical ingredients (APIs) in tablet formulations, especially for medicine screening studies in low-resource settings. Spectra from 950 to 1650 nm were recorded for 170 pharmaceutical products representing 41 different APIs, API combinations or placebos. Most of the products, including 20 falsified medicines, had been collected in medicine quality studies in African countries. After exploratory principal component analysis, models were built using data-driven soft independent modelling of class analogy (DD-SIMCA), a one-class classifier algorithm, for tablet products of penicillin V, sulfamethoxazole/trimethoprim, ciprofloxacin, furosemide, metronidazole, metformin, hydrochlorothiazide, and doxycycline. Spectra of amoxicillin and amoxicillin/clavulanic acid tablets were combined into a single model. Models were tested using Procrustes cross-validation and by projection of spectra of tablets containing the same or different APIs. Tablets containing no or different APIs could be identified with 100 % specificity in all models. A separation of the spectra of amoxicillin and amoxicillin/clavulanic acid tablets was achieved by partial least squares discriminant analysis. 15 out of 19 external validation products (79 %) representing different brands of the same APIs were correctly identified as members of the target class; three of the four rejected samples showed an API mass percentage of the total tablet weight that was out of the range covered in the respective calibration set. Therefore, in future investigations larger and more representative spectral libraries are required for model building. Falsified medicines containing no API, incorrect APIs, or grossly incorrect amounts of the declared APIs could be readily identified. 
Variation between different NIR-S-G1 spectroscopic devices led to a loss of accuracy if spectra recorded with different devices were pooled. Therefore, piecewise direct standardization was applied for calibration transfer. The investigated method is a promising tool for medicine screening studies in low-resource settings.
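The calibration transfer step can be illustrated with a heavily simplified sketch: here each master-instrument wavelength is predicted from the single corresponding slave-instrument wavelength, whereas real piecewise direct standardization regresses each master channel on a small moving window of neighbouring slave channels. The data below are synthetic, not measured spectra.

```python
def fit_transfer(master, slave):
    """Per-wavelength slope/offset mapping slave spectra onto the master
    instrument (window size 1; real PDS uses a multivariate moving window).

    master, slave -- lists of spectra (lists of absorbances) recorded on
    the two devices for the same calibration samples.
    """
    coefs = []
    for j in range(len(master[0])):
        x = [s[j] for s in slave]   # slave readings at wavelength j
        y = [m[j] for m in master]  # master readings at wavelength j
        mx, my = sum(x) / len(x), sum(y) / len(y)
        slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
                 / sum((xi - mx) ** 2 for xi in x))
        coefs.append((slope, my - slope * mx))
    return coefs

def apply_transfer(spectrum, coefs):
    """Project one slave spectrum into the master instrument's space."""
    return [a * v + b for v, (a, b) in zip(spectrum, coefs)]

# Synthetic example: the slave device reads 2x + 1 at every wavelength
master = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
slave = [[2 * v + 1 for v in s] for s in master]
coefs = fit_transfer(master, slave)
corrected = apply_transfer(slave[0], coefs)   # approximately master[0]
```

After such a transfer, spectra from different devices can be pooled into one model library without the accuracy loss described above.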
The Circular Economy aims to return the value of products to the economic cycle at the same value-chain level. While the activities of the Circular Economy are already well defined, there is a gap in how returned products are treated by industry. This study examines how a process should be designed to handle returned products in the context of the Circular Economy. To achieve this, a machine-learning-based algorithm is used to classify data and extract relevant information throughout the product life cycle. The focus of this research is limited to land transportation systems within the Sharing Economy sector.
Context
In a world of high dynamics and uncertainty, it is almost impossible to predict in the long term which products, services, or features will satisfy customer needs. To counter this situation, conducting Continuous Improvement or Design Thinking for product discovery is a common approach. A major constraint in conducting product discovery activities is the high effort required to discover and validate features and requirements. In addition, companies struggle to integrate product discovery activities into their agile processes and iterations.
Objective
This paper suggests a supportive tool, the “Discovery Effort Worthiness (DEW) Index”, for product owners and agile teams to determine a suitable amount of effort to spend on Design Thinking activities. To operationalize DEW, proposals for practitioners are presented that can be used to integrate product discovery into product development and delivery.
Method
A case study was conducted for the development of the DEW index. In addition, we conducted an expert workshop to develop proposals for the integration of product discovery activities into the product development and delivery process.
Results
First, we present the "Discovery Effort Worthiness Index" in the form of a formula. Second, we identified requirements that must be fulfilled for the systematic integration of product discovery activities into product development and delivery. Third, from these requirements we derived proposals for integrating product discovery activities into a company's product development and delivery.
Conclusion
The developed "Discovery Effort Worthiness Index" provides a tool for companies and their product owners to determine how much effort they should spend on Design Thinking methods to discover and validate requirements. Integrating product discovery with product development and delivery should ensure that the results of product discovery are incorporated into product development. This aims to systematically analyze product risks to increase the chance of product success.
The Industry 4.0 paradigm requires concepts for integrating intelligent, smart IoT solutions into manufacturing. Such intelligent solutions are envisioned to increase flexibility and adaptability in smart factories. Especially autonomous cobots capable of adapting to changing conditions are a key enabler for changeable factory concepts. However, identifying the requirements and solution scenarios incorporating intelligent products challenges the manufacturing industry, especially in the SME sector. In pick-and-place scenarios, changing coordinate systems of workpiece carriers cause errors in the placing process. Using the IPIDS framework, this paper describes the development of a tool-center-point positioning method to improve the process stability of a collaborative robot in a changeable assembly workstation. Applying the framework identifies the requirement for an intelligent workpiece carrier as part of the solution. Implementing and evaluating the solution within a changeable factory validates the IPIDS framework.
Due to constantly changing conditions, demand, and technologies, companies increasingly seek flexibility. Productivity results from automation, improved working conditions, and the interaction of people with machines in production. Unfortunately, the human factor is often not considered when new concepts are introduced to increase flexibility and productivity. This work aims to develop a hybrid assistance system that allows a dynamic configuration of cyber-physical production systems, considering the current order situation and available resources by means of simulation. The system also considers human factors in addition to economic factors, which contributes to an extended economic appraisal.
Film formation of the self-synthesized polymer EPM-g-VTMDS (ethylene-propylene rubber, EPM, grafted with vinyltetramethyldisiloxane, VTMDS) was studied with regard to bonding to the adhesion promoter vinyltrimethoxysilane (VTMS) on oxidized 18/10 chromium/nickel (V2A) stainless steel surfaces. Polymer films of different mixed solutions, including a commercial dimethylsiloxane, vinyl-group-terminated silicone crosslinker (HANSA SFA 42100, CAS# 68083-19-2, 0.35 mmol vinyl/g) and a platinum 1,3-diethenyl-1,1,3,3-tetramethyldisiloxane complex, Karstedt's catalyst (ALPA-KAT 1, CAS# 68478-92-2), were spin-coated on V2A stainless steel surfaces with adsorbed VTMS thin layers in order to analyze the film formation of EPM-g-VTMDS at early stages. Surface topography and chemical bonding of the high-performance polymers on differently oxidized V2A surfaces were investigated with X-ray photoelectron spectroscopy (XPS), atomic force microscopy (AFM), scanning electron microscopy (SEM), and surface-enhanced Raman spectroscopy (SERS). AFM and SEM as well as XPS results indicated that the formation of the polymer film proceeds via growth of polymer islands. Chemical signatures of the essential polymer contributions, linker and polymer backbones, could be identified using XPS core-level peak shape analysis and also SERS. The appearance of signals related to Si-O-Si can be seen as a clear indication of lateral crosslinking and silica network formation in the films on the V2A surface.
Mesoporous silica microspheres (MPSMs) find broad application as separation materials in high-performance liquid chromatography (HPLC). A promising preparation strategy uses p(GMA-co-EDMA) as a hard template to control the pore properties and to obtain a narrow size distribution of the MPSMs. Here, six hard templates were prepared which differ in their porosity and surface functionalization. This was achieved by altering the ratio of GMA to EDMA and by adjusting the proportion of monomer and porogen in the polymerization process. The various amounts of GMA incorporated into the polymer network of P1-6 lead to different amounts of tetraethylenepentamine (TEPA) in the p(GMA-co-EDMA) template. This was established by a partial least squares regression (PLS-R) model based on FTIR spectra of the templates. Deposition of silica nanoparticles (SNPs) into the template under Stoeber conditions and subsequent removal of the polymer by calcination result in MPSM1-6. The size of the SNPs and their incorporation depend on the pore parameters of the template and the degree of TEPA functionalization. Moreover, the incorporated SNPs construct the silica network and control the pore parameters of the MPSMs. Functionalization of the MPSMs with trimethoxy(octadecyl)silane allows their use as a stationary phase for the separation of biomolecules. The pore characteristics and the functionalization of the template determine the pore structure of the silica particles and, consequently, their separation properties.
The fifth generation of mobile communication (5G) is a wireless technology developed to provide reliable, fast data transmission for industrial applications, such as autonomous mobile robots, and to connect cyber-physical systems using Internet of Things (IoT) sensors. In this context, private 5G networks enable the full performance of industrial applications built on dedicated 5G infrastructures. However, emerging wireless communication technologies such as 5G are a complex and challenging topic for training in learning factories, often lacking physical or visual interaction. Therefore, this paper aims to develop a real-time performance monitoring system for private 5G networks and different industrial 5G devices to visualise the performance of 5G and the factors influencing it for students and future connectivity experts. Additionally, this paper presents the first long-term measurements of private 5G networks and shows the gap between the actual and targeted performance of private 5G networks.
Since its first publication in 2015, the learning factory morphology has been frequently used to design new learning factories and to classify existing ones. The structuring supports the concretization of ideas and promotes exchange between stakeholders.
However, since the implementation of the first learning factories, the learning factory concept has constantly evolved.
Therefore, in the Working Group "Learning Factory Design" of the International Association of Learning Factories, the existing morphology has been revised and extended based on an analysis of the trends observed in the evolution of learning factory concepts. On the one hand, new design elements were added to the previous seven design dimensions, and on the other hand, new design dimensions were introduced. The revised version of the morphology thus provides even more targeted support in the design of new learning factories in the future.
High-performance liquid chromatography is one of the most important analytical tools for the identification and separation of substances. The efficiency of this method is largely determined by the stationary phase of the columns. Although monodisperse mesoporous silica microspheres (MPSMs) represent a commonly used stationary phase material, their tailored preparation remains challenging. Here we report on the synthesis of four MPSMs via the hard template method. Silica nanoparticles (SNPs), which form the silica network of the final MPSMs, were generated in situ from tetraethyl orthosilicate (TEOS) in the presence of (3-aminopropyl)triethoxysilane (APTES)-functionalized p(GMA-co-EDMA) as the hard template. Methanol, ethanol, 2-propanol, and 1-butanol were applied as solvents to control the size of the SNPs in the hybrid beads (HBs). After calcination, MPSMs with different sizes, morphologies, and pore properties were obtained and characterized by scanning electron microscopy, nitrogen adsorption and desorption measurements, thermogravimetric analysis, solid-state NMR, and DRIFT IR spectroscopy. Interestingly, the 29Si NMR spectra of the HBs show T and Q group species, which suggests that there is no covalent linkage between the SNPs and the template. The MPSMs were functionalized with trimethoxy(octadecyl)silane and used as stationary phases in reversed-phase chromatography to separate a mixture of eleven different amino acids. The separation characteristics of the MPSMs strongly depend on their morphology and pore properties, which are controlled by the solvent during the preparation of the MPSMs. Overall, the separation behavior of the best phases is comparable with that of commercially available columns. The phases even achieve faster separation of the amino acids without loss of quality.
The increase in product variance and shorter product lifecycles result in higher production ramp-up frequencies and promote the usage of mixed-model lines. The ramp-up is considered a critical step in the product life cycle, and in the automotive industry phases of the ramp-up are often executed on separate production lines (pilot lines) or in separate factories (pilot plants) to verify processes and to qualify employees without affecting the production of other products in the mixed-model line. The financial funds required for planning and maintaining dedicated pilot lines prevent small and medium-sized enterprises (SMEs) from applying them. Hence, SMEs require different tools for piloting and training during the production ramp-up. Learning islands, on which employees can be trained through induced and autonomous learning, offer a possible solution. In this work, a concept for their development and application, containing the required organization, activities, and materials, is developed through expert interviews. The results of a case study with a medium-sized automotive manufacturer show that learning islands are a viable tool for employee qualification and process verification during the ramp-up of mixed-model lines.
Monitoring heart rate and breathing is essential to understanding the physiological processes for sleep analysis. Polysomnography (PSG) systems have traditionally been used for sleep monitoring, but alternative methods can help to make sleep monitoring more portable in someone's home. This study conducted a series of experiments to investigate the use of pressure sensors placed under the bed as an alternative to PSG for monitoring heart rate and breathing during sleep. Subsequent experiments involved the addition of small rubber domes, transparent and black, that were glued to the pressure sensor. The resulting data were compared with the PSG system to determine the accuracy of the pressure sensor readings. The study found that the pressure sensor provided reliable data for extracting heart rate and respiration rate, with mean absolute errors (MAEs) of 2.32 and 3.24 for respiration and heart rate, respectively. However, the addition of the small rubber hemispheres did not significantly improve the accuracy of the readings, with MAEs of 2.3 breaths per minute and 7.56 bpm for respiration rate and heart rate, respectively. The findings of this study suggest that pressure sensors placed under the bed may serve as a viable alternative to traditional PSG systems for monitoring heart rate and breathing during sleep. These sensors provide a more comfortable and non-invasive method of sleep monitoring. However, since the addition of the small rubber domes did not significantly enhance accuracy, they may not be a worthwhile addition to the pressure sensor system.
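The agreement measure used above can be sketched as follows; the per-epoch rates are made-up illustration values, not study data.

```python
def mean_absolute_error(reference, estimate):
    """MAE between a reference series (e.g. PSG-derived rates) and the
    estimates extracted from the under-bed pressure sensor."""
    if len(reference) != len(estimate):
        raise ValueError("series must have equal length")
    return sum(abs(r - e) for r, e in zip(reference, estimate)) / len(reference)

# Illustrative per-epoch heart rates in bpm (PSG vs. pressure sensor)
psg    = [62, 64, 61, 63]
sensor = [60, 65, 60, 66]
mae = mean_absolute_error(psg, sensor)   # (2 + 1 + 1 + 3) / 4 = 1.75
```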
The Covid-19 virus triggered a worldwide pandemic, and many employees were therefore required to work from home, which caused numerous challenges. With the Covid-19 pandemic now in its third year, several studies are already available on the subject of home office. To investigate the impact of remote work on employee satisfaction and trust, this quantitative study reviews existing results and formulates hypotheses based on a conceptual model created through a qualitative study and an extensive literature review. The research question is: does home office during Covid-19 affect employee satisfaction and trust? To test the hypotheses, a structural equation model was constructed and analyzed. A culture of trust and flexibility are identified as the biggest influencing factors in this study.
Sleep is an essential part of human existence, as we spend approximately a third of our lives in this state. Sleep disorders are common conditions that can affect many aspects of life. They are diagnosed in special laboratories with a polysomnography system, a costly procedure requiring much effort from the patient. Several systems have been proposed to address this situation by performing the examination and analysis at the patient's home, using sensors to detect physiological signals that are automatically analysed by algorithms. This work aims to evaluate the use of a contactless respiratory recording system based on an accelerometer sensor for sleep apnea detection. For this purpose, an installation mounted under the bed mattress records the oscillations caused by chest movements during breathing. The presented processing algorithm filters the obtained signals and detects the presence of apnea events. The performance of the developed system and apnea event detection algorithm (average accuracy, specificity, and sensitivity of 94.6%, 95.3%, and 93.7%, respectively) confirms the suitability of the proposed method and system for further ambulatory and in-home use.
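The three reported figures are the standard confusion-matrix metrics for a binary detector; a minimal sketch (the event counts below are invented for illustration, not taken from the study):

```python
def detection_metrics(tp, tn, fp, fn):
    """Accuracy, sensitivity, and specificity of a binary apnea detector.

    tp/fn -- apnea epochs correctly flagged / missed
    tn/fp -- normal epochs correctly passed / falsely flagged
    """
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    sensitivity = tp / (tp + fn)   # share of apnea events detected
    specificity = tn / (tn + fp)   # share of normal breathing kept
    return accuracy, sensitivity, specificity

# Invented counts for illustration only
acc, sens, spec = detection_metrics(tp=93, tn=95, fp=5, fn=7)
```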
Healthy sleep is one of the prerequisites for good physical and mental condition, including general well-being. Unfortunately, several sleep disorders can negatively affect this. One of the most common is sleep apnoea, in which breathing is impaired. Studies have shown that this disorder often remains undiagnosed. To avoid this, it is essential to develop a system that can be widely used in a home environment to detect apnoea and to monitor the changes once therapy has been initiated. The conceptualisation of such a system is the main aim of this research. After a thorough analysis of the available literature and the state of the art in this area, a concept of the system was created, which includes the following main components: data acquisition (consisting of two parts), data storage, the apnoea detection algorithm, user and device management, and data visualisation. The modules are interchangeable, and interfaces have been defined for data transfer, most of which operate using the MQTT protocol. System diagrams and detailed component descriptions, including signal requirements and visualisation mockups, have also been developed. The system's design includes the concepts necessary for implementation and can be realised as a prototype in the next phase.
The influence of sleep on human health is enormous. Accordingly, sleep disorders can have a negative impact on it. To avoid this, they should be identified and treated in time. For this purpose, objective (device-based) or subjective (based on perceived values) measurement methods are used for sleep analysis. The aim of this work is to find out whether the two methods can be exchanged and still provide reliable results. To this end, a study was conducted with people aged over 65 (a total of 154 night-time recordings) in which both measurement methods were compared. Sleep questionnaires and electronic devices for sleep assessment placed under the mattress were used. The obtained results indicate that a correlation between the two measurement methods can be observed for sleep characteristics such as total sleep time, total time in bed, and sleep efficiency. However, there are also significant differences in the absolute values of the two measurement approaches for some subjects and nights, which leads us to conclude that substitution is more likely to be viable for long-term monitoring, where trends matter more than the absolute values of individual nights.
Development of an expert system to overpass citizens technological barriers on smart home and living
(2023)
Adopting new technologies can be overwhelming, even for people with experience in the field. For the general public, keeping up with new implementations, releases, brands, and enhancements can cause them to lose interest. There is a clear need for central information sources and platforms that provide helpful information about novel and smart technologies, assisting users, technicians, and providers with products and technologies. The purpose of these platforms is twofold, as they can gather and share information on interests common to manufacturers and vendors. This paper presents the "Finde-Dein-SmartHome" tool, developed in association with the Smart Home & Living competence center [5] to help users learn about, understand, and purchase available technologies that meet their home automation needs. This tool aims to lower the usability barrier and to address potential customers' doubts about privacy and pricing. Communities can use the information provided by this tool to identify market trends that could eventually lower costs for providers and incentivize access to innovative home technologies and devices supporting long-term care.
Managerial accountants spend a large part of their working time on more operational activities in cost accounting, reporting, and operational planning and budgeting. In all these areas, there has been increasing discussion in recent years, both in theory and practice, about using more digital technologies. For reporting, this means not only an intensified discussion of technologies such as RPA and AI but also more intensive changes to existing reporting systems. In particular, management information systems (MIS), which are maintained by managerial accountants and used by managers for corporate management, should be mentioned here. Based on an empirical survey in a large German company, this article discusses the requirements and assessments of users when switching from a regular MIS to a cloud-based system.
The development of automatic solutions for the detection of physiological events of interest is booming. Improvements in the collection and storage of large amounts of healthcare data allow access to these data faster and more efficiently. This means that the development of artificial intelligence models for the detection and monitoring of a large number of pathologies is becoming increasingly common in the medical field. In particular, developing deep learning models for detecting obstructive sleep apnea (OSA) events is at the forefront. Numerous scientific studies focus on the architecture of the models and the results that these models can provide in terms of OSA classification and Apnea-Hypopnea-Index (AHI) calculation. However, little focus is put on other aspects of great relevance that are crucial for the training and performance of the models. Among these aspects are the set of physiological signals used and the preprocessing tasks performed prior to model training. This paper covers the essential requirements that must be considered before training a deep learning model for obstructive sleep apnea detection, and reviews the solutions that currently exist in the scientific literature by analyzing the preprocessing tasks performed prior to training.
During the first years of the last decade, Egypt used to face recurrent electricity cut-offs in summer. In the past few years, the electricity tariff dramatically increased. Radiative cooling to the clear night sky is a renewable energy source that offers a partial solution. The dry desert climate promotes nocturnal radiative cooling applications. This study investigates the potential of nocturnal radiative cooling systems (RCSs) to reduce the energy consumption of the residential building sector in Egypt. The system technology proposed in this work is based on uncovered solar thermal collectors integrated into the building hydronic system. By implementing different control strategies, the same system could be used for both cooling and heating applications. The goal of this paper is to analyze the performance of RCSs in residential buildings in Egypt. The dynamic simulation program TRNSYS was used to simulate the thermal behavior of the system. The relevant issues of Egypt as a case study are firstly overviewed. Then the paper introduces the work done to develop a building model that represents a typical residential apartment in Egypt. Typical occupancy profiles were developed to define the internal thermal gains. The adopted control strategy to optimize the system operation is presented as well. To fully understand and hence evaluate the operation of the proposed RCS, four simulation cases were considered: 1. a reference case (fully passive), 2. the stand-alone operation of the RCS, 3. ideal heating & cooling operation (fully active), and 4. the hybrid operation (when the active cooling system is supported by the proposed RCS). The analysis considered the three main distinct climates in Egypt, represented by the cities of Alexandria, Cairo and Asyut. The hotter and drier weather conditions resulted in a higher cooling potential and larger temperature differences. The simulated cooling power in Asyut was 28.4 W/m² for a 70 m² absorber field.
For a smaller field area of 10 m², the cooling power reached 109 W/m², but with modest temperature differences. To meet the rigorous thermal comfort conditions, the proposed sensible RCS cannot fully replace conventional air-conditioning units, especially in humid areas like Alexandria. When working in a hybrid system, a 10% reduction in the active cooling energy demand could be achieved in Asyut to keep the cooling set-point at 24 °C. This percentage reduction nearly doubled when the thermal comfort set-point was increased by two degrees (26 °C). In a sensitivity analysis, external shading devices as a passive measure as well as the implementation of the Egyptian code for buildings (ECP306/1–2005) were also investigated. The analysis of this study raised other relevant aspects to discuss, e.g. system sizing, environmental effects, limitations and recommendations.
Development of an IoT-based inventory management solution and training module using smart bins
(2023)
Flexibility, transparency and changeability of warehouse environments are playing an increasingly important role in achieving cost-efficient production of small batch sizes. This results in increasing requirements for warehouses in terms of flexibility, scalability, reconfigurability and transparency of material and information flows to deal with a large number of different components and variable material and information flows due to small batch sizes. Therefore, an IoT-based inventory management solution and training module has been developed, implemented and validated at Werk150 – the Factory on campus of the ESB Business School. Key elements of the developed solution are smart bins using weight mats to track the bins' content, and additional sensors and buttons which are connected to an IoT hub to collect data on material consumption and manual handling operations. The use of weight mats for the smart bins offers the possibility to measure the container content independently of the specific component geometry, and thus for a variety of components, based on the specific component weights. The developed solution enables focusing on the key success elements of the system to provide synchronization of the flow of materials and information, resulting in increased flexibility and significantly higher transparency of the material flow. AI-based algorithms are applied to analyse the gathered data and to initiate process optimizations by providing logistics decision makers with a profound and transparent basis for decision making. In order to provide students and industry visitors of the learning factory with the necessary competences and to support the transfer into practice, a training module on IoT-based inventory management was developed and implemented.
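The weight-mat principle described above can be sketched as a small computation: the net weight of a bin, divided by the known per-part weight, yields the inventory count. The tare weight, part weight, and tolerance below are hypothetical illustration values, not taken from the Werk150 implementation.

```python
# Minimal sketch (hypothetical values): deriving a bin's inventory count
# from a gross weight reading, as a weight-mat-based smart bin could do.

def estimate_count(measured_g, tare_g, part_weight_g, tolerance=0.5):
    """Estimate the number of parts in a bin from its gross weight.

    Raises ValueError if the net weight is not close to an integer
    multiple of the part weight (e.g. a foreign object in the bin).
    """
    net = measured_g - tare_g
    count = round(net / part_weight_g)
    if abs(net - count * part_weight_g) > tolerance * part_weight_g:
        raise ValueError("weight does not match an integer part count")
    return max(count, 0)

# A bin tared at 250 g holding screws of ~5.2 g each:
count = estimate_count(measured_g=354.0, tare_g=250.0, part_weight_g=5.2)
```

In practice, per-component weights would come from a master-data record, and readings would stream to the IoT hub for consumption tracking.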
Circular economy aims to support reuse and extend product life cycles through repair, remanufacturing, upgrades and retrofits, as well as closing material cycles through recycling. To successfully manage the necessary transformation processes towards circular economy, manufacturing enterprises rely on the competency of their employees. The definition of competency requirements for circular economy-oriented production networks will contribute to the operationalization of circular economy. The International Association of Learning Factories (IALF) states in its mission the development of learning systems addressing these challenges for the training of students and further education of industry employees. To identify the required competencies for circular economy, the major changes in the product life cycle phases have been investigated based on the state of the science and compared to the socio-technical infrastructure and thematic fields of the learning factories considered in this paper. To operationalize the circular economy approach in the product design and production phase in learning factories, an approach for a cross learning factory network (the so-called "Cross Learning Factory Product Production System (CLFPPS)") has been developed. The proposed CLFPPS represents a network on the design dimensions of learning factories. This approach contributes to the promotion of circular economy in learning factories as it makes use of and combines the focus areas of different learning factories. This enables the CLFPPS to offer a holistic view of the product life cycle in production networks.
The world is becoming increasingly digital. People have become used to learning and interacting with the world around them through technology, a trend accelerated even further by the Covid-19 pandemic. This is especially relevant to the generation currently entering education systems and the workforce. Considering digital aids and methods of learning is therefore important for future learning. The increasing need for online learning makes the case for integrating digital learning aspects such as serious gaming within education and training systems. Learning factories fall amongst the education and training systems that can benefit from integration with digital learning extensions. Digital capabilities such as digital twins and models further enable the exploration of integrating digital serious games as an extension of learning factories. Since learning factories are meant for a range of different learning, training, and research purposes, such serious games need to be adaptable across stakeholder perspectives to maximize the value gained from the time and cost invested in their design and development. Research into the development of adaptive serious games for multiple stakeholder perspectives must first determine whether such a game can be developed in a way that reaches the objectives set for the different stakeholder perspectives included. The purpose of this research is to investigate this through the practical development of a digital adaptive serious game for multiple stakeholder perspectives.
Product engineering and subsequent phases of product lifecycles are predominantly managed in isolation. Companies therefore do not fully exploit potentials through using data from smart factories and product usage. The novel intelligent and integrated Product Lifecycle Management (i²PLM) describes an approach that uses these data for product engineering. This paper describes the i²PLM, shows the cause-and-effect relationships in this context and presents in detail the validation of the approach. The i²PLM is applied and validated on a smart product in an industrial research environment. Here, the subsequent generation of a smart lunchbox is developed based on production and sensor data. The results of the validation give indications for further improvements of the i²PLM. This paper describes how to integrate the i²PLM into a learning factory.
This article examines the risks and societal costs associated with flexible average inflation targeting in the United States and symmetric inflation targeting in the Eurozone. Employing an empirical approach, we analyze monthly cumulative inflation gaps over a monetary policy horizon of 36 months. By investigating the trajectories of the cumulative inflation gaps, we find a heavy-tailed distribution and a 20 percent probability of over- and undershooting the inflation target. We show that the offsetting mechanism introduced in the revised monetary strategies lacks credibility in ensuring price stability during a period of persistent inflation. Consequently, the credibility of central banks may be compromised. The policy implications are the integration of an escape clause and prompt monetary corrections in cases where the inflation goal is not achieved. This study provides insights for policymakers and central banks, emphasizing challenges in maintaining credibility and price stability within the new monetary strategies.
In the course of more intensive energy generation from renewable sources, an increased number of energy storage systems is required. In addition to the widespread means of storing electric energy, storing energy thermally can contribute significantly. However, limited research exists on the behaviour of thermal energy storages (TES) in practical operation. While the physical processes are well known, it is nevertheless often not possible to adequately evaluate their performance with respect to the quality of thermal stratification inside the tank, which is crucial for the thermodynamic effectiveness of the TES. The behaviour of a TES is experimentally investigated in cyclic charging and discharging operation in interaction with a cogeneration (CHP) unit at a test rig in the lab. From the measurements, the quality of thermal stratification is evaluated under varying conditions using different metrics such as the normalised stratification factor, modified MIX number, exergy number and exergy efficiency, which extends the state of the art for CHP applications. The results show that the positioning of the temperature sensors for turning the CHP unit on and off has a significant influence on both the effective capacity of a TES and the quality of thermal stratification inside the tank. It is also revealed that positioning at least one of these sensors outside the storage tank, i.e. in the return line to the CHP unit, prevents deterioration of thermal stratification, thereby enhancing thermodynamic effectiveness. Furthermore, the effects of thermal load and thermal load profile on effective capacity and thermal stratification are discussed, even though these are much smaller compared to the effect of positioning the temperature sensors.
The properties of polyelectrolyte multilayers are ruled by the process parameters employed during self-assembly. This is the first study in which a design of experiment approach was used to validate and control the production of ultrathin polyelectrolyte multilayer coatings by identifying the ranges of critical process parameters (polyelectrolyte concentration, ionic strength and pH) within which coatings with reproducible properties (thickness, refractive index and hydrophilicity) are created. Mathematical models describing the combined impact of key process parameters on coatings properties were developed demonstrating that only ionic strength and pH affect the coatings thickness, but not polyelectrolyte concentration. While the electrolyte concentration had a linear effect, the pH contribution was described by a quadratic polynomial. A significant contribution of this study is the development of a new approach to estimate the thickness of polyelectrolyte multilayer nanofilms by quantitative rhodamine B staining, which might be useful in all cases when ellipsometry is not feasible due to the shape complexity or small size of the coated substrate. The novel approach proposed here overcomes the limitations of known methods as it offers a low spatial sampling size and the ability to analyse a wide area without restrictions on the chemical composition and shape of the substrate.
Twitter and citations
(2023)
Social media, especially Twitter, plays an increasingly important role among researchers in showcasing and promoting their research. Does Twitter affect academic citations? Making use of Twitter activity about columns published on VoxEU, a renowned online platform for economists, we develop an instrumental variable strategy to show that Twitter activity about a research paper has a causal effect on the number of citations that this paper will receive. We find that the existence of at least one tweet, as opposed to none, increases citations by 16-25%. Doubling overall Twitter engagement boosts citations by up to 16%.
Towards a sustainable future, looking beyond the system boundaries of a single manufacturing company is necessary to promote meaningful collaborations in terms of circular economy principles. In this context, digital data processing technologies to connect the potential collaborators are seen as enablers to make use of proven collaborative circular business models (CCBMs). Since most such data processing technologies rely on features to describe the entities involved, it is essential to provide guidance for identifying and selecting the relevant and most appropriate ones. Defining critical success factors (CSFs) is considered a suitable instrument to describe the decisive factors. A systematic literature review (SLR), followed by a qualitative synthesis, investigates two scientific fields of work, namely (1) the generally relevant features of CCBMs and (2) methodologies for determining CSFs. This results in the development of a conceptual framework which provides guidance for digital applications that perform further digital processing based on the relevant CSFs relating to the specific CCBM.
Machine failures’ consequences – a classification model considering ultra-efficiency criteria
(2023)
To strive for sustainable production, maintenance has to evaluate possible machine failure consequences not just economically but also holistically. Approaches such as the ultra-efficiency factory consider energy, material, human/staff, emission, and organization as optimization dimensions. These ultra-efficiency dimensions can be used to analyze not only the respective machine failure but also its effects on the entire production system holistically. This paper presents an easy-to-use method, based on a questionnaire, for assessing the failure consequences of a machine malfunction in a production system considering the ultra-efficiency dimensions. The method was validated in a battery production.
Using predictive maintenance, more efficient processes can be implemented, leading to fewer maintenance costs and increased availability. The development of a predictive maintenance solution currently requires high efforts in time and capacity as well as often interdisciplinary cooperation. This paper presents a standardized model to describe a predictive maintenance use case. The description model is used to collect, present, and document the required information for the implementation of predictive maintenance use cases by and for different stakeholders. Based on this model, predictive maintenance solutions can be introduced more efficiently. The method is validated across departments in the automotive sector.
The increasing complexity and need for availability of automated guided vehicles (AGVs) pose challenges to companies, leading to a focus on new maintenance strategies. In this paper, a smart maintenance architecture based on a digital twin is presented to optimize the technical and economic effectiveness of AGV maintenance activities. To realize this, a literature review was conducted to identify the necessary requirements for smart maintenance and digital twins. The identified requirements were combined into modules and then integrated into an architecture. The architecture was evaluated on a real AGV, with the battery as one of its critical components.
Condition monitoring supported with artificial intelligence, cloud computing, and industrial internet of things (IIoT) technologies increases the feasibility of predictive maintenance. However, the cost of traditional sensors, data acquisition systems, and the required information technology expert-knowledge challenge the industry. This paper presents a hybrid condition monitoring system (CMS) architecture consisting of a distributed, low-cost IIoT-sensor solution. The CMS uses micro-electro-mechanical system (MEMS) microphones for data acquisition, edge computing for signal preprocessing, and cloud computing, including artificial neural networks (ANN) for higher-level information processing. The system's feasibility is validated using a testbed for reciprocating linear-motion axes.
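The edge-computing preprocessing stage described above can be sketched in a few lines: reducing a raw microphone signal to compact per-frame features before forwarding them to the cloud-side neural network. The framing, feature choice (RMS), and signal below are assumptions for illustration; the paper's actual signal chain is not reproduced here.

```python
# Minimal sketch of edge preprocessing: split a raw acoustic signal into
# non-overlapping frames and compute the RMS of each frame, so only small
# feature vectors (not raw audio) need to be sent to the cloud.
import math

def frame_rms(signal, frame_len):
    """Return the RMS value of each non-overlapping frame of the signal."""
    feats = []
    for start in range(0, len(signal) - frame_len + 1, frame_len):
        frame = signal[start:start + frame_len]
        feats.append(math.sqrt(sum(s * s for s in frame) / frame_len))
    return feats

# Synthetic test signal: a quiet tone followed by the same tone 8x louder
signal = [0.1 * math.sin(0.2 * i) for i in range(100)] + \
         [0.8 * math.sin(0.2 * i) for i in range(100)]
features = frame_rms(signal, frame_len=100)
```

A rising RMS trend in such features could, for example, flag increasing vibration noise from a worn linear-motion axis for the higher-level ANN to classify.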
Herein, biochar from biomass residues is demonstrated as an active material for the catalytic cracking of waste motor oil into diesel-like fuels. Above all, alkali-treated rice husk biochar showed great activity, with a 250% increase in the kinetic constant compared to thermal cracking. It also showed better activity than synthetic materials, as previously reported. Moreover, a much lower activation energy (185.77 to 293.48 kJ/mol) for the cracking process was also obtained. According to the materials characterization, the catalytic activity was related more to the nature of the biochar's surface than to its specific surface area. Finally, the liquid products complied with all the physical properties defined by international standards for diesel-like fuels, with the presence of hydrocarbon chains between C10 and C27 similar to the ones obtained in commercial diesel.
Digital twins deployed in production are important in practice and interesting for research. Currently, mostly structured data, coming e.g. from sensors and timestamps of related stations, are integrated into digital twins. However, semi- and unstructured data are also important to display the current status of a digital twin (e.g., of a machine or produced good). Process mining and text mining in combination can be used to exploit log file data to understand the current state of the process as well as to highlight issues. Therefore, issue-related reactions can be taken more quickly and in a more targeted and cost-oriented way. Applying a design science research approach, a prototype is developed here as an artefact based on derived requirements. This prototype helps to understand and clarify the possibilities of process mining and text mining based on log data for production-related digital twins. Contributions for practice and research are described. Furthermore, limitations of the research and future opportunities are pointed out.
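The combination of log data, process mining, and text mining described above can be sketched as a tiny pipeline: parse semi-structured log lines into an event log (case id, activity, timestamp) for process mining, while flagging error messages as a trivial text-mining step. The log format, field names, and messages below are assumptions for illustration, not the prototype's actual implementation.

```python
# Minimal sketch: turn semi-structured log-file lines into an event log
# suitable for process mining, and flag error messages along the way.
import re
from datetime import datetime

LINE = re.compile(r"^(?P<ts>\S+ \S+) \[(?P<case>\w+)\] (?P<msg>.*)$")

def parse_log(lines):
    """Return (events, issues): events as (case, activity, timestamp)
    tuples, issues as (case, message) tuples for lines mentioning errors."""
    events, issues = [], []
    for line in lines:
        m = LINE.match(line)
        if not m:
            continue  # skip lines that do not match the assumed format
        ts = datetime.strptime(m["ts"], "%Y-%m-%d %H:%M:%S")
        events.append((m["case"], m["msg"], ts))
        if "error" in m["msg"].lower():
            issues.append((m["case"], m["msg"]))
    return events, issues

log = [
    "2024-05-01 08:00:00 [job42] start milling",
    "2024-05-01 08:05:12 [job42] ERROR: spindle overload",
    "2024-05-01 08:07:30 [job42] end milling",
]
events, issues = parse_log(log)
```

From such an event log, a process mining tool can reconstruct the actual process flow per case, while the flagged issues feed the digital twin's status display.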
This paper presents the first part of research work conducted at the University of Applied Sciences Stuttgart (HFT Stuttgart). The aim of the research was to investigate the potential of low-cost renewable energy systems to reduce the energy demand of the building sector in hot and dry areas. Radiative cooling to the night sky represents a low-cost renewable energy source. The dry desert climate conditions promote radiative cooling applications. The system technology adopted in this work is based on uncovered solar thermal collectors integrated into the building's hydronic system. By implementing different control strategies, the same system could be used for cooling as well as for heating applications. This paper focuses on identifying the collector parameters which are required as the coefficients to configure such an unglazed collector for calibrating its mathematical model within the simulation environment. The parameter identification process implies testing the collector for its thermal performance. This paper attempts to provide an insight into the dynamic testing of uncovered solar thermal collectors (absorbers), taking into account their prospective operation at nighttime for radiative cooling applications. In this study, the main parameters characterizing the performance of the absorbers for radiative cooling applications are identified and obtained from a standardized testing protocol. For this aim, a number of plastic solar absorbers of different designs were tested on the outdoor test-stand facility at HFT Stuttgart for the characterization of their thermal performance. The testing process was based on the quasi-dynamic test method of the international standard for solar thermal collectors EN ISO 9806. The test database was then used within a mathematical optimization tool (GenOpt) to determine the optimal parameter settings of each absorber under testing. Those performance parameters were significant for comparing the thermal performance of the tested absorbers.
The coefficients (identified parameters) were then used to plot the thermal efficiency curves of all absorbers, for both the heating and cooling modes of operation. Based on the intended main scope of the system utilization (heating or cooling), the tested absorbers could be benchmarked. Hence, one of those absorbers was selected to be used in the following simulation phase, as planned in the research project.
In countries such as Germany, where municipalities have planning sovereignty, problems of urban sprawl often arise. As the dynamics of land development have not substantially subsided over the last years, the national government decided to test the instrument of 'Tradable Planning Permits' (TPP) in a nationwide field experiment with 87 municipalities involved. The field experiment was able to implement the key features of a TPP system in a laboratory setting with approximated real socioeconomic and planning conditions. In a TPP system, allocated planning permits must be used by municipalities for developing land. The permits can be traded between local jurisdictions, so that they have flexibility in deciding how to comply with the regulation. In order to evaluate the performance of such a system, specific field data about future building areas and their impact on community budgets for the period 2014–2028 were collected. The field experiment contains several sessions with representatives of the municipalities and with students. The participants were confronted with two (municipalities) and four (students) schemes. The results show that a trading system can curb land development in an effective and also efficient manner. However, depending on the regulatory framework, the trading schemes show different price developments and distributional effects. The inexperienced representatives of the local authorities could easily handle the permits in the administration and in the established market. A trading scheme sets very high incentives to save open space and to direct development activities to areas within existing planning boundaries. It is therefore a promising instrument for Germany and also other regions or countries with an established land-use planning system.
We analyze economics PhDs' collaborations in peer-reviewed journals from 1990 to 2014 and investigate such collaborations' quality in relation to each co-author's research quality, field and specialization. We find that a greater overlap between co-authors' previous research fields is significantly related to greater publication success of co-authors' joint work, and this is robust to alternative specifications. Co-authors that engage in a distant collaboration are significantly more likely to have a large research overlap, but this significance is lost when co-authors' social networks are accounted for. High-quality collaboration is more likely to emerge as a result of an interaction between specialists and generalists with overlapping fields of expertise. Regarding interactions across subfields of economics (interdisciplinarity), it is more likely to be conducted by co-authors who already have interdisciplinary portfolios than by co-authors who are specialized or starred in different subfields.
COVID-19 and educational inequality: How school closures affect low- and high-achieving students
(2021)
In spring 2020, governments around the globe shut down schools to mitigate the spread of the novel coronavirus. We argue that low-achieving students may be particularly affected by the lack of educator support during school closures. We collect detailed time-use information on students before and during the school closures in a survey of 1099 parents in Germany. We find that while students on average reduced their daily learning time of 7.4 h by about half, the reduction was significantly larger for low-achievers (4.1 h) than for high-achievers (3.7 h). Low-achievers disproportionately replaced learning time with detrimental activities such as TV or computer games rather than with activities more conducive to child development. The learning gap was not compensated by parents or schools who provided less support for low-achieving students.
This paper presents a description model for smart, connected devices used in a manufacturing context. Similar to the widespread adoption of smart products for personal and private usage, recent developments have led to a plethora of devices offering a variety of features and capabilities. Manufacturing companies undergoing digital transformation demand guidance with respect to the systematic introduction of smart, connected devices. The introduction of smart, connected devices constitutes a strategic decision due to the high future committed costs after introduction and for maintaining a smart device fleet from a vendor. This paper aims to support the introduction efforts by classifying the devices and thus helping companies identify their specific requirements for smart, connected devices before initiating widespread procurement. Mapping the features of these devices based on various attributes allows the clustering of smart, connected devices, including a requirement list for their implementation on the shopfloor. Four individual commercially available smart, connected devices were analyzed using the description model.
Parallel grippers offer multiple applications thanks to their flexibility. Their application field ranges from aerospace and automotive to medicine and communication technologies. However, grippers exhibit wear and errors during the execution of their operations, which affects their performance. In this context, the remaining useful life (RUL) defines the remaining lifespan of an asset, at a particular time of operation, until failure occurs. The exact lifespan of an asset is uncertain, thus the RUL model and estimation must be derived from available sources of information. This paper presents a method for the estimation of the RUL of a two-jaw parallel gripper. After the introduction to the topic, an overview of the existing literature and RUL methods is presented. Subsequently, the method for estimating the RUL of grippers is explained. Finally, the results are summarized and discussed before the outlook and further challenges are presented.
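The general idea of RUL estimation from available degradation data can be illustrated with a deliberately simple sketch: fit a linear trend to a wear indicator and extrapolate to a failure threshold. This is not the paper's method, and the indicator, threshold, and measurements below are hypothetical.

```python
# Illustrative only: RUL from a linear degradation trend. A wear indicator
# (e.g. gripping-force loss) is fitted over cycle counts by least squares
# and extrapolated to a hypothetical failure threshold.

def estimate_rul(cycles, indicator, threshold):
    """Fit indicator vs. cycles linearly, extrapolate to the failure
    threshold, and return the remaining cycles from the last sample."""
    n = len(cycles)
    mx = sum(cycles) / n
    my = sum(indicator) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(cycles, indicator)) / \
            sum((x - mx) ** 2 for x in cycles)
    intercept = my - slope * mx
    if slope <= 0:
        raise ValueError("no degradation trend detected")
    failure_cycle = (threshold - intercept) / slope
    return failure_cycle - cycles[-1]

# Hypothetical wear growing by ~0.01 per 100 cycles; failure assumed at 1.0
cycles = [0, 100, 200, 300, 400]
wear = [0.50, 0.51, 0.52, 0.53, 0.54]
rul = estimate_rul(cycles, wear, threshold=1.0)
```

Real RUL models typically account for the uncertainty the abstract mentions, e.g. via probabilistic degradation models rather than a point estimate.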
The general conclusion of climate change studies is the necessity of eliminating net CO2 emissions in general, and from electric power systems in particular, by 2050. The share of renewable energy is increasing worldwide, but due to the intermittent nature of wind and solar power, a lack of system flexibility is already hampering the further integration of renewable energy in some countries. In this study, we analyze if and how combinations of carbon pricing and power-to-gas (PtG) generation in the form of green power-to-hydrogen followed by methanation (which we refer to as PtG throughout) using captured CO2 emissions can provide transitions to deep decarbonization of energy systems. To this end, we focus on the economics of deep decarbonization of the European electricity system with the help of an energy system model. In different scenario analyses, we find that a CO2 price of 160 €/t (by 2050) is on its own not sufficient to decarbonize the electricity sector, but that a CO2 price path of 125 €/t (by 2040) up to 160 €/t (by 2050), combined with PtG technologies, can lead to an economically feasible decarbonization of the European electricity system by 2050. These results are robust to higher than anticipated PtG costs.