The proper selection of a demand forecasting method is directly linked to the success of supply chain management (SCM). However, today’s manufacturing companies are confronted with uncertain and dynamic markets. Consequently, classical statistical methods are not always appropriate for accurate and reliable forecasting. Artificial Intelligence (AI) algorithms are currently used to improve statistical methods. The existing literature gives only a very general overview of the AI methods used in combination with demand forecasting. This paper provides an analysis of the AI methods published in the last five years (2017-2021). Furthermore, a classification is presented by clustering the AI methods in order to identify trends in the methods applied. Finally, a classification of the different AI methods according to the dimensionality of data, volume of data, and time horizon of the forecast is presented. The goal is to support the selection of the appropriate AI method to optimize demand forecasting.
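The final classification by data dimensionality, data volume, and forecast horizon can be pictured as a selection rule. A minimal sketch (the method names and thresholds below are illustrative assumptions, not the paper's actual classification):

```python
# Hypothetical selector mapping data characteristics to an AI forecasting
# method family; thresholds and method names are illustrative only.
def suggest_method(n_features: int, n_samples: int, horizon: str) -> str:
    if n_samples < 1000:
        # Small data volumes: classical ML often beats deep learning.
        return "tree-based ensemble (e.g. gradient boosting)"
    if n_features > 10 and horizon == "long":
        # High-dimensional data, long horizon: sequence models.
        return "recurrent neural network (e.g. LSTM)"
    if horizon == "short":
        return "feed-forward neural network"
    return "hybrid statistical + ML model"

print(suggest_method(3, 500, "short"))
```

In practice such a rule would be derived from the survey's clustering, not hard-coded.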
On-chip metallization, especially in modern integrated BCD technologies, is often subject to high current densities and pronounced temperature cycles due to heat dissipation from power switches like LDMOS transistors. This paper continues the work on a sensor concept where small sense lines are embedded in the metallization layers above the active area of a switching LDMOS transistor. The sensors show a significant resistance change that correlates with the number of power cycles. Furthermore, influences of sense line layer, geometry and the dissipated energy are shown. In this paper, the focus lies on a more detailed analysis of the observed change in sense line resistance.
Relationship marketing is an important issue in every business. Knowing the customers and establishing, maintaining and enhancing long-term customer relationships is a key component of long-term business success. Considering that sport is such big business today, it is surprising that this crucial approach to marketing has yet to be fully recognised either in literature or in the sports business itself. Relationship Marketing in Sports aims to fill this void by discussing and reformulating the principles of relationship marketing and by demonstrating how relationship marketing can be successfully applied in practice within a sports context. Written by a unique author team of academic and practitioner experience, the book provides the reader with:
- the first book to apply the principles of relationship marketing specifically to a sports context
- case studies from around the world to provide a uniquely global approach applicable worldwide
- strong pedagogical features including learning outcomes, overviews, discussion questions, glossary, guided reading and web links
- practical advice for professional, semi-professional and non-professional sporting organisations
- a companion website providing web links, case studies and PowerPoint slides for lecturers.
Relationship Marketing in Sports is crucial reading for both students and professionals alike and marks a turning point in the marketing of sports.
Reflectometry has long been known as an interferometric method which can be used to characterize surfaces and thin films regarding their structure and, to a certain degree, their composition as well. Properties like layer structures, layer thickness, density, and interface roughness can be determined by fitting the obtained reflectivity data with an appropriate model using a recursive fitting routine. However, one major drawback of the reflectometric method is its restriction to planar surfaces. In this article we demonstrate an approach to apply X-ray and neutron reflectometry to curved surfaces using the example of bent bare and coated glass slides. We prove the possibility to observe all features like Fresnel decay, Kiessig fringes, Bragg peaks and off-specular scattering and are able to interpret the data using common fitting software and to derive quantitative results about roughness, layer thickness and internal structure. The proposed method has become practical due to the availability of high-quality 2D detectors. It opens up the option to explore many kinds and shapes of samples which, due to their geometry, have not been in the focus of reflectometry techniques until now.
Context: An experiment-driven approach to software product and service development is gaining increasing attention as a way to channel limited resources to the efficient creation of customer value. In this approach, software capabilities are developed incrementally and validated in continuous experiments with stakeholders such as customers and users. The experiments provide factual feedback for guiding subsequent development.
Objective: This paper explores the state of the practice of experimentation in the software industry. It also identifies the key challenges and success factors that practitioners associate with the approach.
Method: A qualitative survey based on semi-structured interviews and thematic coding analysis was conducted. Ten Finnish software development companies, represented by thirteen interviewees, participated in the study.
Results: The study found that although the principles of continuous experimentation resonated with industry practitioners, the state of the practice is not yet mature. In particular, experimentation is rarely systematic and continuous. Key challenges relate to changing the organizational culture, accelerating the development cycle speed, and finding the right measures for customer value and product success. Success factors include a supportive organizational culture, deep customer and domain knowledge, and the availability of the relevant skills and tools to conduct experiments.
Conclusions: It is concluded that the major issues in moving towards continuous experimentation are on an organizational level; most significant technical challenges have been solved. An evolutionary approach is proposed as a way to transition towards experiment-driven development.
Comparative analysis of the R&D efficiency of 14 leading pharmaceutical companies for the years 1999–2018 shows, first, that there is a close positive correlation between R&D spending and the two investigated R&D output parameters: approved NMEs and the cumulative impact factor of their publications. In other words, higher R&D investments (input) were associated with higher R&D output. Second, our analyses indicate that there are "economies of scale" (size) in pharmaceutical R&D.
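The reported input-output relationship is a plain Pearson correlation; with fabricated illustrative figures it can be computed as follows (the numbers are not from the study):

```python
import math

def pearson(xs, ys):
    # Pearson correlation coefficient between two equal-length series.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Illustrative (invented) figures: R&D spend vs. approved NMEs.
spend = [2.1, 3.5, 5.0, 7.2, 9.8]   # bn USD
nmes  = [3, 5, 8, 10, 14]           # approvals
print(round(pearson(spend, nmes), 3))
```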
Employing diffuse reflection ultraviolet-visible (UV–Vis) spectroscopy, we developed an approach that is capable of quantitatively determining flux residues on a technical copper surface. The technical copper surface was soldered with a no-clean flux system of organic acids. By a post-solder cleaning step with different cleaning parameters, various levels of residues were produced. The surface was quantitatively and qualitatively characterized using X-ray photoelectron spectroscopy (XPS), Auger electron spectroscopy (AES), Fourier transform infrared spectroscopy (FTIR) and diffuse reflection UV–Vis spectroscopy. With the use of multivariate analysis (MVA), we examined the UV–Vis data to create a correlation to the carbon content on the surface. The UV–Vis data could be discriminated for all groups by their level of organic residues. Combined with XPS, the data were evaluated by a partial least squares (PLS) regression to establish a model. Based on this predictive model, the carbon content was calculated with an absolute error of 2.7 at.%. Due to the high correlation of the predictive model, the easy-to-use measurement and the evaluation by multivariate analysis, the developed method seems suitable for an online monitoring system. With this system, flux residues can be detected in a manufacturing cleaning process of technical surfaces after soldering.
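The PLS step calibrates spectral features against the XPS-measured carbon content. As a simplified one-variable stand-in for the multivariate PLS model, an ordinary least-squares calibration illustrates the idea (all data values below are invented for illustration):

```python
def fit_line(xs, ys):
    # Ordinary least squares for y = a*x + b (single predictor); a
    # one-variable stand-in for the paper's multivariate PLS calibration.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    a = sxy / sxx
    return a, my - a * mx

# Illustrative data: UV-Vis absorbance feature vs. XPS carbon content (at.%).
absorbance = [0.10, 0.25, 0.40, 0.55, 0.70]
carbon     = [5.0, 12.0, 19.0, 26.0, 33.0]
a, b = fit_line(absorbance, carbon)
print(round(a * 0.50 + b, 1))  # predicted carbon content at absorbance 0.50
```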
Pultrusion of braids
(2016)
Properties data of phenolic resins synthetized for the impregnation of saturating Kraft paper
(2018)
The quality of decorative laminate boards depends on the impregnation process of Kraft papers with a phenolic resin, which constitute the raw materials for the manufacture of the cores of such boards. In the laminates industry, the properties of resins are adapted via their syntheses, usually by mixing phenol and formaldehyde in a batch, where additives, temperature and stirring parameters can be controlled. Therefore, many possibilities for preparing phenolic resins exist, leading to different combinations of physico-chemical properties. In this article, the properties data of eight phenolic resins synthesized with different parameters of pH and reaction times at 60 °C and 90 °C are presented: the losses of pH after synthesis and the dynamic viscosities measured after synthesis and once the solid content is adjusted to 45% w/w in methanol. Data acquired by Differential Scanning Calorimetry (DSC) of the resins and Inverse Gas Chromatography (IGC) of cured solids are given as well.
The increase in product variance and shorter product lifecycles result in higher production ramp-up frequencies and promote the usage of mixed-model lines. The ramp-up is considered a critical step in the product life cycle, and in the automotive industry phases of the ramp-up are often executed on separate production lines (pilot lines) or factories (pilot plants) to verify processes and to qualify employees without affecting the production of other products in the mixed-model line. The financial funds required for planning and maintaining dedicated pilot lines prevent small and medium-sized enterprises (SMEs) from applying this approach. Hence, SMEs require different tools for piloting and training during the production ramp-up. Learning islands, on which employees can be trained through induced and autonomous learning, offer a solution. In this work, a concept for their development and application, which contains the required organization, activities, and materials, is developed through expert interviews. The results of a case study application with a medium-sized automotive manufacturer show that learning islands are a viable tool for employee qualification and process verification during the ramp-up of mixed-model lines.
Process analysis and process control have attracted increasing interest in recent years. The development and application of process analytical methods are a prerequisite for the knowledge-based manufacturing of industrial goods and allow for the production of high-value products of defined, constantly good quality. Discussed in this chapter are the measurement principle and some relevant aspects and illustrative examples of online monitoring tools as the basis for process control in the manufacturing and processing of thermosetting resins. Optical spectroscopy is featured as one of the main process analytical methods applicable to, among other applications, online monitoring of resin synthesis. In combination with chemometric methods for multivariate data analysis, powerful process models can be generated within the framework of feedback and feed-forward control concepts. Other analytical methods covered in this chapter are those frequently used to control further processing of thermosets to the final parts, including dielectric analysis, ultrasonics, fiber optics, and Fiber Bragg Grating sensors.
Processing
(2014)
In this chapter, some relevant aspects and illustrative examples of online monitoring tools as the basis for process control in the manufacturing and processing of thermosetting resins are briefly discussed. In principle, any chemical or physical information made accessible by sensors can be used for online monitoring of resin formation, resin location in the mold, and resin cure. For instance, changes in the flow properties of the reaction mixture are often routinely recorded as a function of reaction time during resin synthesis as a measure for the degree of conversion of raw materials into macromolecules or oligomers by applying rheometry in an in-process environment. Typically, a small sample of the reaction mixture is by-passed, subjected to rheological measurement, and re-introduced into the bulk reactor. In a similar way, pH measurements, turbidimetric measurements, or other analyses are performed. Although rheometry may not always be suitable for following resin cure (especially in cases where there is a very rapid increase in viscosity after initiation of the cure) [1], the method can in principle also be used in the subsequent processing of the thermosets, for instance in the curing of wood glue applied to wood specimens [2]. Similarly, pH changes during thermoset curing can be followed. Hence, an encyclopedic and comprehensive approach to presenting process control methods would proceed systematically according to the involved physical measurement principle. However, since only a very brief sketch of means for monitoring thermoset processing can be given here, only a small, personally biased selection of important methods and application examples is addressed in the following sections. These examples hopefully illustrate some of the general strategies and solutions to problems that are typically encountered when processing thermosets.
The properties of polyelectrolyte multilayers are ruled by the process parameters employed during self-assembly. This is the first study in which a design of experiment approach was used to validate and control the production of ultrathin polyelectrolyte multilayer coatings by identifying the ranges of critical process parameters (polyelectrolyte concentration, ionic strength and pH) within which coatings with reproducible properties (thickness, refractive index and hydrophilicity) are created. Mathematical models describing the combined impact of key process parameters on coatings properties were developed demonstrating that only ionic strength and pH affect the coatings thickness, but not polyelectrolyte concentration. While the electrolyte concentration had a linear effect, the pH contribution was described by a quadratic polynomial. A significant contribution of this study is the development of a new approach to estimate the thickness of polyelectrolyte multilayer nanofilms by quantitative rhodamine B staining, which might be useful in all cases when ellipsometry is not feasible due to the shape complexity or small size of the coated substrate. The novel approach proposed here overcomes the limitations of known methods as it offers a low spatial sampling size and the ability to analyse a wide area without restrictions on the chemical composition and shape of the substrate.
Preliminary results of homomorphic deconvolution application to surface EMG signals during walking
(2021)
Homomorphic deconvolution is applied to sEMG signals recorded during walking. Gastrocnemius lateralis and tibialis anterior signals were acquired according to the SENIAM recommendations. MUAP parameters like amplitude and scale were estimated, whilst the MUAP shape parameter was fixed. This provides a useful time-frequency representation of the sEMG signal. The estimation of the MUAP scale parameter was verified by extracting the mean frequency of the filtered EMG signal, derived from the scale parameter estimated with two different MUAP shape values.
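Homomorphic deconvolution separates a convolved excitation and MUAP shape via the cepstrum, i.e. the inverse transform of the log magnitude spectrum. A minimal sketch on a toy signal (a naive O(N²) DFT stands in for the FFT used in practice):

```python
import cmath, math

def real_cepstrum(x):
    # Real cepstrum: inverse DFT of the log magnitude spectrum.
    # Naive O(N^2) DFT for illustration; use an FFT in practice.
    n = len(x)
    spec = [sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
            for k in range(n)]
    logmag = [math.log(abs(s) + 1e-12) for s in spec]  # avoid log(0)
    ceps = [sum(logmag[k] * cmath.exp(2j * math.pi * k * t / n)
                for k in range(n)).real / n for t in range(n)]
    return ceps

# Toy signal: a decaying oscillation standing in for an sEMG burst.
sig = [math.exp(-0.3 * t) * math.sin(0.8 * t) for t in range(32)]
c = real_cepstrum(sig)
print(len(c))
```

Low-quefrency cepstral coefficients then carry the MUAP shape/scale information, while higher quefrencies reflect the firing pattern.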
Clay minerals play an increasingly important role as functional fillers and reinforcing materials for clay polymer nanocomposites (CPN) in advanced applications. Among the prerequisites necessary for polymer improvement by clay minerals are a homogeneous and stable distribution of the clay mineral throughout the CPN, good compatibility of the reinforcement with the matrix component, and suitable processability. Typically, clay minerals are surface-modified with organic interface-active compounds like detergents or silanes to obtain favorable properties as filler. They are incorporated into the polymer matrix using manufacturing equipment like extruders, batch reactors or other mixing machines. In order for the surface modification to survive the stresses and strains during incorporation, the modified clay minerals must display sufficient thermal and mechanical stability to retain the compatibilizing effect. In the present study, thermogravimetry was used in combination with isoconversional kinetic analysis to determine the thermal stability of a silane-modified clay mineral based on bentonite. These findings were compared with the stability of the same clay mineral that was only surfactant-modified. It was found that silane modification leads to significantly improved thermal stability, which depends strongly on the type of silane employed.
The general conclusion of climate change studies is the necessity of eliminating net CO2 emissions in general and from the electric power systems in particular by 2050. The share of renewable energy is increasing worldwide, but due to the intermittent nature of wind and solar power, a lack of system flexibility is already hampering the further integration of renewable energy in some countries. In this study, we analyze if and how combinations of carbon pricing and power-to-gas (PtG) generation in the form of green power-to-hydrogen followed by methanation (which we refer to as PtG throughout) using captured CO2 emissions can provide transitions to deep decarbonization of energy systems. To this end, we focus on the economics of deep decarbonization of the European electricity system with the help of an energy system model. In different scenario analyses, we find that a CO2 price of 160 €/t (by 2050) is on its own not sufficient to decarbonize the electricity sector, but that a CO2 price path of 125 (by 2040) up to 160 €/t (by 2050), combined with PtG technologies, can lead to an economically feasible decarbonization of the European electricity system by 2050. These results are robust to higher than anticipated PtG costs.
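The effect of a CO2 price on dispatch economics can be sketched with a marginal-cost calculation (the fuel price, plant efficiency, and emission factor below are generic assumptions, not the study's model inputs):

```python
def marginal_cost(fuel_cost, efficiency, emission_factor, co2_price):
    # Marginal generation cost in EUR/MWh_el: fuel cost plus carbon
    # cost per thermal MWh, divided by the electrical efficiency.
    return (fuel_cost + emission_factor * co2_price) / efficiency

# Illustrative assumptions: natural gas ~25 EUR/MWh_th,
# emission factor ~0.202 tCO2/MWh_th, CCGT efficiency 58%.
for p in (0, 125, 160):
    print(p, round(marginal_cost(25, 0.58, 0.202, p), 1))
```

At 160 EUR/t the carbon component more than doubles the marginal cost of gas-fired generation, which is the lever that makes PtG-based alternatives competitive in such scenarios.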
The powder coating of veneered particle boards by the sequence electrostatic powder application followed by powder curing via hot pressing is studied in order to create high-gloss surfaces. To obtain an appealing aspect, veneer sheets were glued by heat and pressure on top of particle boards and the resulting surfaces were used as carrier substrates for powder coat finishing. Prior to the powder coating, the veneered particle board surfaces were pre-treated by sanding to obtain good uniformity, and the boards were stored in a climate chamber at controlled temperature and humidity conditions to adjust an appropriate electrical surface resistance. Characterization of surface texture was done by 3D microscopy. The surface electrical resistance was measured for the six veneers before and after their application on the particle board surface. A transparent powder top-coat was applied electrostatically onto the veneered particle board surface. Curing of the powder was done using a heated press at 130 °C for 8 min, and a smooth, glossy coating was obtained on the veneered surfaces. By applying different amounts of powder the coating thickness could be varied, and the optimum amount of powder was determined for each veneer type.
The wet chemical deposition of solution-processed transparent conducting oxides (TCO) provides an alternative, low-cost and economical deposition technique to realize large areas of conducting films. Since the price of the most common TCO, Indium Tin Oxide, has risen enormously, Aluminum Zinc Oxide (AZO) is attracting more and more interest as an alternative TCO. The optoelectronic properties of nanoparticle coatings depend strongly not only on the porosity of the coating but also on the shape and size of the particles used. By using bigger or rod-shaped particles it is possible to minimize the number of grain boundaries, resulting in an improvement of the electrical properties, whereas particles bigger than 100 nm should not be used if highly transparent coatings are necessary, as these big particles scatter visible light and lower the transmittance of the coatings. In this work we present a simple method to synthesize AZO particles with different shapes and sizes but comparable electronic properties. We use a simple, well-reproducible polyol method for synthesis and influence the shape and size of the particles by adding different amounts of water to the precursor solution. We can show that the addition of aluminum as a dopant strongly hinders crystal growth, but the addition of water counteracts this, so that both spherical and rod-shaped particles can be obtained.
Due to the complexity of assembly processes, a high ratio of tasks is still performed by human workers. Short-cyclically changing work contents due to smaller lot sizes, especially in varied series assembly, increase both the need for information support and the risk of rising physical and psychological stress. The use of technical and digital assistance systems can counter these challenges. Through the integration of information and communication technology as well as collaborative assembly technologies, hybrid cyber-physical assembly systems will emerge. Widely established assembly planning approaches for digital and technical support systems in cyber-physical assembly systems are outlined and discussed with regard to synergies and delimitations of planning perspectives.
Monitoring heart rate and breathing is essential in understanding the physiological processes for sleep analysis. Polysomnography (PSG) systems have traditionally been used for sleep monitoring, but alternative methods can help to make sleep monitoring more portable in someone's home. This study conducted a series of experiments to investigate the use of pressure sensors placed under the bed as an alternative to PSG for monitoring heart rate and breathing during sleep. Subsequent sets of experiments involved the addition of small rubber domes, transparent and black, that were glued to the pressure sensor. The resulting data were compared with the PSG system to determine the accuracy of the pressure sensor readings. The study found that the pressure sensor provided reliable data for extracting heart rate and respiration rate, with mean absolute errors (MAE) of 2.32 and 3.24 for respiration and heart rate, respectively. However, the addition of small rubber hemispheres did not significantly improve the accuracy of the readings, with MAEs of 2.3 breaths per minute and 7.56 bpm for respiration rate and heart rate, respectively. The findings of this study suggest that pressure sensors placed under the bed may serve as a viable alternative to traditional PSG systems for monitoring heart rate and breathing during sleep. These sensors provide a more comfortable and non-invasive method of sleep monitoring. However, the addition of small rubber domes did not significantly enhance the accuracy of the readings, indicating that it may not be a worthwhile addition to the pressure sensor system.
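The reported accuracy figures are mean absolute errors against the PSG reference; the metric itself is straightforward (the series below are invented for illustration):

```python
def mae(est, ref):
    # Mean absolute error between sensor estimates and PSG reference.
    return sum(abs(e - r) for e, r in zip(est, ref)) / len(est)

# Illustrative heart-rate estimates vs. PSG reference values (bpm).
sensor_hr = [62, 64, 61, 65, 63]
psg_hr    = [60, 65, 63, 64, 62]
print(round(mae(sensor_hr, psg_hr), 2))
```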
Context: Companies increasingly strive to adapt to market and ecosystem changes in real time. Gauging and understanding team performance in such changing environments present a major challenge.
Objective: This paper aims to understand how software developers experience the continuous adaptation of performance in a modern, highly volatile environment using Lean and Agile software development methodology. This understanding can be used as a basis for guiding formation and maintenance of high-performing teams, to inform performance improvement initiatives, and to improve working conditions for software developers.
Method: A qualitative multiple-case study using thematic interviews was conducted with 16 experienced practitioners in five organisations.
Results: We generated a grounded theory, Performance Alignment Work, showing how software developers experience performance. We found 33 major categories of performance factors and relationships between the factors. A cross-case comparison revealed similarities and differences between different kinds and different sizes of organisations.
Conclusions: Based on our study, software teams are engaged in a constant cycle of interpreting their own performance and negotiating its alignment with other stakeholders. While differences across organisational sizes exist, a common set of performance experiences is present despite differences in context variables. Enhancing performance experiences requires integration of soft factors, such as communication, team spirit, team identity, and values, into the overall development process. Our findings suggest a view of software development and software team performance that centres around behavioural and social sciences.
Rare but extreme events, such as pandemics, terror attacks, and stock market collapses, pose a risk that could undermine cooperation in societies and groups. We extend the public goods game (PGG) to investigate the relationship between rare but extreme external risks and cooperation in a laboratory experiment. By incorporating risk as an external random variable in the PGG, independent of the participants’ contributions, we preserve the economic equilibrium of non-cooperation in the original game. Furthermore, we examine whether cooperation can be restored by the relatively simple intervention of informing about countermeasures while keeping the actual risk constant. Our experimental results reveal that on average extreme risks indeed decrease contributions by about 20%; however, countermeasure information increases contributions by about 10%. Specifically, in the first interactions, cooperation levels can even reach those observed in the riskless baseline. Our results suggest that countermeasure information could help reinforce social cohesion and resilience in the face of rare but extreme risks.
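The extended game adds an external shock that is independent of contributions, so the non-cooperative equilibrium of the standard PGG is preserved. A minimal sketch of one round (multiplier, endowment, and shock parameters are illustrative assumptions, not the experiment's values):

```python
import random

def pgg_payoff(contribs, multiplier=1.6, endowment=20,
               shock_prob=0.05, shock_loss=0.8, rng=None):
    # One public goods game round with a rare external shock that destroys
    # a fraction of each player's holdings, independent of contributions.
    rng = rng or random.Random(42)  # fixed seed for a reproducible sketch
    pot = sum(contribs) * multiplier
    share = pot / len(contribs)
    payoffs = [endowment - c + share for c in contribs]
    if rng.random() < shock_prob:  # rare extreme event
        payoffs = [p * (1 - shock_loss) for p in payoffs]
    return payoffs

print(pgg_payoff([10, 10, 0, 5]))
```

Because the multiplier per capita (1.6/4) is below 1, contributing nothing remains the dominant strategy with or without the shock, matching the preserved equilibrium described above.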
A systematic study using a central composite design of experiments (DoE) was performed on the oxygen plasma surface modifications of two different polymers—Pellethane 2363-55DE, which is a polyurethane, and vinyltrimethoxysilane-grafted ethylene-propylene (EPR-g-VTMS), a cross-linked ethylene-propylene rubber. The impacts of four parameters—gas pressure, generator power, treatment duration, and process temperature—were assessed, with static contact angles and calculated surface free energies (SFEs) as the main responses in the DoE. The plasma effects on the surface roughness and chemistry were determined using scanning electron microscopy (SEM) and X-ray photoelectron spectroscopy (XPS). Through the sufficiently accurate DoE model evaluation, oxygen gas pressure was established as the most impactful factor, with the surface energy and polarity rising with falling oxygen pressure. Both polymers, though different in composition, exhibited similar modification trends in surface energy rise in the studied system. The SEM images showed a rougher surface topography after low pressure plasma treatments. XPS and subsequent multivariate data analysis of the spectra established that higher oxidized species were formed with plasma treatments at low oxygen pressures of 0.2 mbar.
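A central composite design combines factorial corners, axial (star) points, and center replicates. A sketch in coded factor units (the axial distance and number of center runs are assumptions, not the study's settings):

```python
from itertools import product

def central_composite(k, alpha=2.0, center_runs=3):
    # Central composite design in k coded factors: 2^k factorial corners,
    # 2k axial (star) points at distance alpha, and center replicates.
    corners = [list(p) for p in product([-1, 1], repeat=k)]
    axial = []
    for i in range(k):
        for a in (-alpha, alpha):
            pt = [0.0] * k
            pt[i] = a
            axial.append(pt)
    centers = [[0.0] * k for _ in range(center_runs)]
    return corners + axial + centers

design = central_composite(4)  # four factors, as in the plasma DoE
print(len(design))  # 16 corners + 8 axial + 3 centers = 27 runs
```

The axial points are what let the fitted response surface include quadratic terms, i.e. detect curvature in the pressure, power, duration, and temperature effects.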
The use of additive manufacturing technologies for industrial production is constantly growing. This technology differs from known production procedures. Scheduling, detailed planning, and sequence planning are particularly important for additive production due to the long print times and the flexible use of the production area. Therefore, production-relevant variables are considered and used for the production planning and control (PPC) of additive manufacturing machines. For this purpose, an optimization model is presented which yields a time-oriented build space utilization. In the implementation, a nesting algorithm is used to check the combinability of different models for each individual print job.
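The build-space nesting idea can be approximated in one dimension by treating each part as a platform-area footprint and packing jobs greedily (a crude stand-in for true 2D nesting; all figures are illustrative):

```python
def plan_jobs(parts, build_area, utilisation_cap=0.8):
    # Greedy first-fit-decreasing assignment of part footprints (mm^2)
    # to print jobs, capped at a fraction of the build platform area.
    cap = build_area * utilisation_cap
    jobs = []
    for area in sorted(parts, reverse=True):
        for job in jobs:
            if sum(job) + area <= cap:
                job.append(area)
                break
        else:
            jobs.append([area])  # open a new print job
    return jobs

# Illustrative footprints on a 200x200 mm platform (cap 32000 mm^2).
print(plan_jobs([15000, 12000, 9000, 20000, 6000], 200 * 200))
```

A real PPC model would additionally weigh part heights (which drive print time), due dates, and geometric compatibility rather than area alone.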
This paper generalizes the theory of policy uncertainty by integrating it with the new literature on rational inattention. First, the model demonstrates that inattention depends on the signal variance and the policy parameter. Second, I discover a novel trade-off showing that a policy instrument mitigates attention. Third, the policy instrument is non-linear and reciprocal to both the size and variance of the signal. The unifying theory creates new implications for economic theory and public policy alike.
Conventional production systems are evolving, through cyber-physical systems and application-oriented approaches of AI, more and more into "smart" production systems, which are characterized, among other things, by a high level of communication and integration of the individual components. The exchange of information between the systems is usually oriented only towards the data content, while semantics is usually considered only implicitly. The adaptability required by external and internal influences demands the integration of new components or the redesign of existing ones. Through an open, application-oriented ontology, the information and communication exchange is extended by explicit semantic information. This enables better integration of new components and easier reconfiguration of existing ones. The developed ontology and the derived application and use of the semantic information are evaluated by means of a practical use case.
Wave-like differential equations occur in many engineering applications. Here the engineering setup is embedded into the framework of functional analysis of modern mathematical physics. After an overview, the Hilbert space approach to free Euler–Bernoulli bending vibrations of a beam in one spatial dimension is investigated. We analyze in detail the corresponding positive, self-adjoint differential operators of 4th order associated to the boundary conditions in statics. A comparison with the free vibrations of a string is outlined.
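The operator-theoretic setting sketched above can be made concrete as follows (a standard textbook formulation; the notation here is an assumption, not taken from the article):

```latex
% Free Euler–Bernoulli bending vibrations of a uniform beam of length L:
\rho A\,\partial_t^2 u(x,t) + EI\,\partial_x^4 u(x,t) = 0,
\qquad 0 < x < L,
% written abstractly as \ddot u + \mathcal{A}u = 0 with the 4th-order operator
(\mathcal{A}u)(x) = \frac{EI}{\rho A}\, u''''(x),
% which is positive and self-adjoint on L^2(0,L) once its domain encodes the
% boundary conditions, e.g. clamped ends: u(0)=u'(0)=u(L)=u'(L)=0.
% For comparison, the string satisfies the 2nd-order wave equation
% \partial_t^2 u - c^2\,\partial_x^2 u = 0.
```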
On the design of an urban data and modeling platform and its application to urban district analyses
(2020)
An integrated urban platform is the essential software infrastructure for smart, sustainable and resilient city planning, operation and maintenance. Today such platforms are mostly designed to handle and analyze large and heterogeneous urban data sets from very different domains. Modeling and optimization functionalities are usually not part of the software concepts. However, such functionalities are considered crucial by the authors to develop transformation scenarios and to optimize smart city operation. An urban platform needs to handle multiple scales in the time and spatial domains, ranging from long-term population and land use change to hourly or sub-hourly matching of renewable energy supply and urban energy demand.
The fiber deformations of once-dried, bleached and never-dried unbleached kraft pulps were studied with respect to their behavior in high- and low-consistency refining. The pulps were stained with Congo red to experimentally highlight areas where the arrangement of the fibrils was altered by refining, such as dislocated zones or slip planes. The stained fibers were analyzed with conventional Metso Fiberlab but also with a novel prototype measurement device utilizing a color imaging setup. The local intensity of the stain in the fiber was expressed as degree of overall damage (Overall fiber damage index, OFDI). The rewetted zero span tensile index (RWZSTI) was used to verify the OFDI with respect to the pulp strength. High consistency refining resulted in a clear increase in the number of kinks, which negatively influenced the pulp strength. The OFDI, which was used to detect the intensity of local fiber defects, also responded accordingly. A higher OFDI resulted in a lower pulp strength. Low consistency refining removed a significant amount of kinks and resulted in an increase in fiber swelling. A slight increase in fibrillation and a significant increase in flake-like fines were also observed. The OFDI, however, was not reduced in low consistency refining, as would be expected from the removal of less severe dislocations. One reason proposed here is that low consistency refining created new fiber pores that allowed the dye to penetrate into the fiber wall similarly as it does in the zones of the dislocations.
Respiratory diseases are leading causes of death and disability in the world. The recent COVID-19 pandemic also affects the respiratory system. Detecting and diagnosing respiratory diseases requires both medical professionals and a clinical environment. Most of the techniques used to date have also been invasive or expensive.
Some research groups are developing hardware devices and techniques to make possible a non-invasive or even remote respiratory sound acquisition. These sounds are then processed and analysed for clinical, scientific, or educational purposes.
We present a literature review of non-invasive sound acquisition devices and techniques.
The results cover a large number of digital tools, such as microphones, wearables, and Internet of Things devices, that can be used for this purpose.
Some interesting applications have been found. Some devices make sound acquisition easier in a clinical environment, while others make daily monitoring possible outside that setting. We aim to use some of these devices and include the non-invasively recorded respiratory sounds in a Digital Twin system for personalized health.
This paper is concerned with the study, optimization and control of the moisture sorption kinetics of agricultural products at temperatures typically found in processing and storage. A nonlinear autoregressive with exogenous inputs (NARX) neural network was developed to predict the moisture sorption kinetics, and consequently the equilibrium moisture contents, of shiitake mushrooms (Lentinula edodes (Berk.) Pegler) over a wide range of relative humidity and different temperatures. Sorption kinetic data of mushroom caps were separately generated using a continuous, gravimetric dynamic vapour sorption analyser at temperatures of 25-40 °C over a stepwise variation of relative humidity ranging from 0 to 85%. The predictive power of the neural network was based on physical data, namely relative humidity and temperature. The model was fed with a total of 4500 data points divided into three subsets, randomly selected from the whole dataset: 70% of the data was used for training, 15% for testing and 15% for validation. The NARX neural network was capable of precisely simulating the equilibrium moisture contents of mushrooms derived from the dynamic vapour sorption kinetic data throughout the entire range of relative humidity.
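The 70/15/15 random partitioning described above can be sketched as follows. The helper name `split_dataset` and the toy data points are illustrative assumptions, not the authors' implementation.

```python
import random

def split_dataset(data, train=0.70, test=0.15, seed=42):
    """Randomly partition a dataset into training, testing and
    validation subsets (validation is the remainder)."""
    rng = random.Random(seed)          # fixed seed for reproducibility
    shuffled = list(data)
    rng.shuffle(shuffled)
    n = len(shuffled)
    n_train = round(n * train)
    n_test = round(n * test)
    return (shuffled[:n_train],
            shuffled[n_train:n_train + n_test],
            shuffled[n_train + n_test:])

# 4500 synthetic (relative humidity, temperature, moisture) tuples,
# standing in for the sorption measurements used in the study
points = [(i % 86, 25 + (i % 16), 0.1 * (i % 30)) for i in range(4500)]
train_set, test_set, val_set = split_dataset(points)
```

With 4500 points, this yields 3150 training, 675 testing and 675 validation samples, matching the proportions reported in the abstract.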
Nanocoatings based on sol–gel processes are presented as a suitable tool for modifying polymer-based materials. The main focus is on textiles as the most common polymer materials. The chapter presents which types of functionalization can be achieved by modified sol–gel processes. A suitable categorization of functions is also given and related to common applications. Special attention is paid to the functional properties antimicrobial, UV-protective, and flame-retardant. The concept of bifunctional coatings is discussed, and in particular the combination of water-repellent and antistatic properties is presented.
Thin radio-frequency magnetron sputter-deposited nano-hydroxyapatite (HA) films were prepared on the surface of a Fe-tricalcium phosphate (Fe-TCP) bioceramic composite, which was obtained using a conventional powder injection moulding technique. The obtained nano-hydroxyapatite-coated Fe-TCP biocomposites (nano-HA-Fe-TCP) were studied with respect to their chemical and phase composition, surface morphology, water contact angle, surface free energy and hysteresis. The deposition process resulted in a homogeneous, single-phase HA coating. The ability of the surface to support the adhesion and proliferation of human mesenchymal stem cells (hMSCs) was studied using short-term biological tests in vitro. The surface of the uncoated Fe-TCP bioceramic composite showed initial cell attachment 24 h after seeding, but adhesion, proliferation and growth did not persist over 14 days of culture. However, the HA-Fe-TCP surfaces allowed cell adhesion and proliferation throughout the 14 days. The deposition of the nano-HA films on the Fe-TCP surface resulted in a higher surface energy, improved hydrophilicity and better biocompatibility compared with the surface of the uncoated Fe-TCP. Furthermore, it is suggested that an increase in the polar component of the surface energy was responsible for the enhanced cell adhesion and proliferation in the case of the nano-HA-Fe-TCP biocomposites.
The isothermal curing of melamine resin is investigated by in-line infrared spectroscopy at different temperatures. The infrared spectra are decomposed into time courses of characteristic spectral patterns using Multivariate Curve Resolution (MCR). It was found that, depending on the applied curing temperature, melamine films with different spectral fingerprints and correspondingly different chemical network structures are formed. The network structures of fully cured resin films are specific to the curing temperatures applied and cannot simply be compensated for by changes in the curing time. For industrial curing processes, this means that the curing temperature is the main system-determining factor at a constant M:F ratio. However, different MF resin networks can be specifically obtained from one and the same melamine resin by suitable selection of the curing time and temperature profiles to design resin functionality. The spectral fingerprints after short as well as long curing times reflect the fundamental differences in the thermoset networks that can be obtained with industrial short-cycle and multi-daylight presses.
Training at assembly workstations in production at SMEs (small and medium-sized enterprises) often does not take place at all, or only insufficiently. In addition to the lack of technical content, there are also, from an ergonomic point of view, incorrect movement sequences that "untrained" people usually acquire automatically. An AI-based approach is used to analyze a defined workflow for a specific assembly scope with regard to the behavior of several employees. Based on these different behaviors, the AI gives feedback on the points in time, work steps and movements at which particularly dangerous incorrect postures occur. Motion capturing and digital human model simulation, in combination with the results of the AI, define the optimized workflow. Individual employees can be trained directly because the AI identifies their most serious incorrect postures and provides them with a direct comparison of their "wrong" posture and the corresponding "easy-on-the-joints" posture. With the assistance of various test persons, the AI can conduct a study in which the most frequently occurring incorrect postures are identified. This can be done in general or tailored to specific groups of people (e.g. "People over 1.90 m tall must be particularly careful not to make the following mistake..."). The approach will be tested and validated at the Werk150, the factory of the ESB Business School on the campus of Reutlingen University. The newly gained knowledge will subsequently be used for training in SMEs.
Model-based hearing diagnosis based on wideband tympanometry measurements utilizing fuzzy arithmetic
(2019)
Today's audiometric methods for the diagnosis of middle-ear disease are often based on a comparison of measurements with standard curves that represent the statistical range of normal hearing responses. Because of large inter-individual variances in the middle ear, especially in wideband tympanometry (WBT), specificity and quantitative evaluation are greatly restricted. A new model-based approach could transform today's predominantly qualitative hearing diagnostics into a quantitative, tailored, patient-specific diagnosis by evaluating WBT measurements with the aid of a middle-ear model. For this particular investigation, a finite element model of a human ear was used. It consisted of an acoustic ear canal and tympanic cavity model, a middle ear with detailed nonlinear models of the tympanic membrane and annular ligament, and a simplified inner-ear model. This model has made it possible to identify pathologies from measurements by analyzing the parameters through sensitivity studies and parameter clustering. Uncertainties due to lack of knowledge, subjectivity in the numerical implementation and model simplification are taken into account by the application of fuzzy arithmetic. The most confident parameter set can be determined by applying an inverse fuzzy method to the measurement data. The principle and the benefits of this model-based approach are illustrated by the example of a two-mass oscillator, and also by the simulation of the energy absorbance of an ear with malleus fixation, where the introduced parameter changes can be determined quantitatively through system identification.
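Fuzzy arithmetic of the kind used above is commonly evaluated alpha-cut by alpha-cut, where each cut reduces to ordinary interval arithmetic. The following minimal sketch illustrates that reduction with hypothetical parameter intervals; it is not the authors' model-based method.

```python
def interval_add(a, b):
    """Sum of two intervals [a0, a1] + [b0, b1]."""
    return (a[0] + b[0], a[1] + b[1])

def interval_mul(a, b):
    """Product of two intervals: min/max over all endpoint products,
    which also handles intervals containing negative values."""
    products = [a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1]]
    return (min(products), max(products))

# Hypothetical alpha-cut intervals for two uncertain model parameters
# (normalised units, purely illustrative)
stiffness = (1, 2)
displacement = (3, 4)
force = interval_mul(stiffness, displacement)   # uncertain output interval
```

Propagating every alpha-cut of a fuzzy parameter through such interval operations yields the fuzzy output whose inverse evaluation against measurements is described in the abstract.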
Human bestrophin-1 (hBest1) is a transmembrane Ca2+-dependent anion channel, associated with the transport of Cl−, HCO3− ions, γ-aminobutyric acid (GABA) and glutamate (Glu), and with the regulation of retinal homeostasis. Its mutant forms cause retinal degenerative diseases, defined as bestrophinopathies. Using both physicochemical approaches - surface pressure/mean molecular area (π/A) isotherms, hysteresis, compressibility moduli of hBest1/sphingomyelin (SM) monolayers and Brewster angle microscopy (BAM) studies - and biological approaches - detergent membrane fractionation, Laurdan (6-dodecanoyl-N,N-dimethyl-2-naphthylamine) and immunofluorescence staining of stably transfected MDCK-hBest1 and MDCK II cells - we report:
1) Ca2+, Glu and GABA interact with binary hBest1/SM monolayers at 35 °C, resulting in changes in hBest1 surface conformation, structure, self-organization and surface dynamics. The mixing process in hBest1/SM monolayers is spontaneous, and the effect of the protein on the binary films was defined as “fluidizing”, hindering the phase transition of the monolayer from the liquid-expanded to the intermediate (LE-M) state;
2) in stably transfected MDCK-hBest1 cells, bestrophin-1 was distributed between detergent-resistant (DRM) and detergent-soluble membranes (DSM) - up to 30 % and 70 %, respectively; in live cells, hBest1 was visualized in both liquid-ordered (Lo) and liquid-disordered (Ld) fractions, with protein association quantified at up to 35 % with Lo and 65 % with Ld. Our results indicate that the spontaneous miscibility of hBest1 and SM is a prerequisite for the diverse interactions of the protein with membrane domains, its different structural conformations and its biological functions.
The present work proposes the use of modern ICT technologies, such as smartphones, NFC, the internet, and web technologies, to help patients carry out their therapies. The implemented system provides a calendar with intake reminders, ensures drug identification through NFC, and allows remote assistance from healthcare staff and family members to check and manage the therapy in real time. The system also provides centralized information on the patient's therapeutic situation, helpful in choosing new compatible therapies.
The fifth mobile communications generation (5G) offers the deployment scenario of licensed 5G standalone non-public networks (NPNs). Standalone NPNs are locally restricted 5G networks based on 5G New Radio technology that are fully isolated from public networks. NPNs operate on their own dedicated core network and offer organizations high data security and customizability through intrinsic network control. Especially in networked and cloud manufacturing, 5G is seen as a promising enabler for delay-sensitive applications such as autonomous mobile robots and robot motion control based on the tactile internet, which require wireless communication with deterministic traffic and strict cycle times. However, currently available industrial standalone NPNs do not meet the performance parameters defined in the 5G specification and standardization process. Current research lacks performance measurements of the download and upload rates and time delays of 5G standalone-capable end-devices in NPNs with currently available software and hardware in industrial settings. Therefore, this paper presents initial measurements of the data rate and the round-trip delay in standalone NPNs with various end-devices to generate a first performance benchmark for 5G-based applications. In addition, five end-devices are compared to gain insights into the performance of currently available standalone-capable 5G chipsets. To validate the data rate, three locally hosted measurement methods, namely iPerf3, LibreSpeed and OpenSpeedTest, are used. Locally hosted Ping and LibreSpeed were executed to validate the time delay. The 5G standalone NPN of Reutlingen University uses licensed frequencies between 3.7 and 3.8 GHz and serves as the testbed for this study.
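The Ping-style round-trip-delay measurement mentioned above can be illustrated in principle with a loopback UDP echo. This sketch is a stand-in for the actual iPerf3/LibreSpeed tooling used in the study; all names and the local-echo setup are assumptions for illustration only.

```python
import socket
import threading
import time

def udp_echo_server(sock):
    """Echo every received datagram back to its sender."""
    while True:
        data, addr = sock.recvfrom(1024)
        if data == b"stop":
            break
        sock.sendto(data, addr)

def measure_rtt(server_addr, payload=b"ping", samples=5):
    """Send small datagrams and return the mean round-trip delay in seconds."""
    client = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    client.settimeout(2.0)
    rtts = []
    for _ in range(samples):
        t0 = time.perf_counter()
        client.sendto(payload, server_addr)
        client.recvfrom(1024)               # wait for the echo
        rtts.append(time.perf_counter() - t0)
    client.close()
    return sum(rtts) / len(rtts)

# Local echo endpoint standing in for a remote measurement server
server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
server.bind(("127.0.0.1", 0))               # ephemeral port
addr = server.getsockname()
threading.Thread(target=udp_echo_server, args=(server,), daemon=True).start()
mean_rtt = measure_rtt(addr)
```

Against a 5G NPN, the echo endpoint would sit behind the radio link, so the measured delay would include the air interface and core network rather than only the loopback path.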
The use of deep learning models with medical data is becoming more widespread. However, although numerous models have shown high accuracy in medical tasks such as medical image recognition (e.g. radiographs), there are still many obstacles to deploying these models in a real healthcare environment. This article presents a series of basic requirements that must be taken into account when developing deep learning models for biomedical time-series classification tasks, with the aim of facilitating the subsequent deployment of the models in healthcare. These requirements range from the correct collection of data to the existing techniques for correctly explaining the results obtained by the models. Indeed, one of the main reasons why deep learning models are not more widely used in healthcare settings is their lack of clarity when it comes to explaining decision-making.
The effectiveness of different machine learning algorithms is evaluated on a publicly available database of signals derived from wearable devices, with the goal of optimizing human activity recognition and classification. Among the wide range of body signals, we chose two that are easy to acquire simultaneously using widespread commercial devices (e.g. smartwatches) as well as custom wearable wireless devices designed for sport, healthcare, or clinical purposes: photoplethysmographic signals (optically detected subcutaneous blood volume) and tri-axial acceleration signals. To this end, two widely used algorithms (decision tree and k-nearest neighbor) were tested, and their performance was compared with that of two recent algorithms (particle Bernstein and a Monte Carlo-based regression), both in terms of accuracy and processing time. A data preprocessing phase was also considered to reduce the problem size and improve the performance of the machine learning procedures, and a detailed analysis of the compression strategy and its results is presented.
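As one of the baselines tested above, k-nearest-neighbor classification can be sketched in a few lines. The toy feature vectors and labels below are hypothetical and unrelated to the actual database used in the study.

```python
import math
from collections import Counter

def knn_classify(train, query, k=3):
    """Classify a feature vector by majority vote among the k nearest
    training samples, using Euclidean distance."""
    nearest = sorted(train, key=lambda sample: math.dist(sample[0], query))
    votes = Counter(label for _, label in nearest[:k])
    return votes.most_common(1)[0][0]

# Toy (mean |acceleration|, pulse-rate proxy) features; labels illustrative
training_set = [
    ((0.10, 60), "rest"), ((0.20, 65), "rest"), ((0.15, 62), "rest"),
    ((1.10, 120), "run"), ((1.30, 130), "run"), ((1.20, 125), "run"),
]
activity = knn_classify(training_set, (1.25, 128))
```

In the study's setting, the feature vectors would instead be derived from the preprocessed photoplethysmographic and tri-axial acceleration signals.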
Machine failures’ consequences – a classification model considering ultra-efficiency criteria
(2023)
To strive for sustainable production, maintenance has to evaluate possible machine failure consequences not just economically but also holistically. Approaches such as the ultra-efficiency factory consider energy, material, human/staff, emission, and organization as optimization dimensions. These ultra-efficiency dimensions can be used to analyze not only the respective machine failure but also its effects on the entire production system holistically. This paper presents an easy-to-use, questionnaire-based method for assessing the failure consequences of a machine malfunction in a production system considering the ultra-efficiency dimensions. The method was validated in a battery production facility.
A strategy is needed to adjust people's performance capabilities to new requirements and guarantee employability in the world of work. Good examples of this are the current changes in the logistics environment. New services and processes close to production are regularly taken into the portfolio of logistics enterprises, so the daily tasks of skilled workers are changing continuously.
LOPEC aims at developing and offering specially tailored training in Lean Logistics and the required basic skills for skilled workers at the shop-floor level. The know-how needed for today's challenges in logistics will be transferred. Another aspect of LOPEC is the development and use of a personal excellence self-assessment that allows a person to assess and thus improve his or her own level of maturity in employability skills. LOPEC thus aims at people enhancement as an entry ticket to lifelong continuous learning by increasing the maturity level of personal logistics excellence. A common European view of “logistics personal excellence” for skilled workers will ensure that the final product is an open product, using internationally and pan-European validated standards. As results, LOPEC will provide training modules for post-secondary education in the area of Lean Logistics and the required basic skills, and it offers transparency of personal excellence through a self-assessment software solution that shows the personal maturity level of hard and soft skills at any time. It can be used as an innovative tool for monitoring personal lifelong learning routes as well as within companies as a strategic tool for human resource development.
Learning factories on demand
(2021)
Learning factories are research and learning environments that demonstrate new concepts and technologies for industry in a practical setting. The interaction between physical and virtual components is a central aspect. Mediation and presentation usually take place directly in the learning factory and are thus limited in time and with regard to the user group. A learning factory on demand can be provided by dividing and virtualizing the individual components via containers and microservices. This enables both local operation and operation in hybrid-cloud or cloud systems. Physical components can be mapped either through standardized interfaces or suitable emulators. Using the example of the learning factory at Reutlingen University (Werk150), it is shown how different use cases can be made available through software-based orchestration, thus promoting broader and more independent teaching.
In the last decade, numerous learning factories for education, training, and research have been built up in industry and academia. In recent years, learning factory initiatives were elevated from a local to a European and then to a worldwide level. Since 2014, the CIRP Collaborative Working Group (CWG) on Learning Factories has enabled a lively exchange on the topic "Learning Factories for future-oriented research and education in manufacturing". In this paper, results of the discussions inside the CWG are presented. First, what is meant by the term learning factory is outlined. Second, based on this definition, a description model (morphology) for learning factories is presented. The morphology covers the most relevant characteristics and features of learning factories in seven dimensions. Third, following the morphology, the actual variance of learning factory manifestations is shown in six application scenarios ranging from industrial training over education to research. Finally, future prospects of the learning factory concept are presented.
Learning factories present a promising environment for education, training and research, especially in manufacturing-related areas, which are a main driver of wealth creation in any nation. While numerous learning factories have been built in industry and academia in the last decades, a comprehensive scientific overview of the topic is still missing. This paper intends to close this gap by establishing the state of the art of learning factories. The motivations, historic background, and didactic foundations of learning factories are outlined. Definitions of the term learning factory and the corresponding morphological model are provided. An overview of existing learning factory approaches in industry and academia is given, showing the broad range of different applications and varying contents. The state of the art of learning factory curricula design and their use to enhance learning and research, as well as potentials and limitations, are presented. Conclusions and an outlook on further research priorities are offered.
Herein, biochar from biomass residues is demonstrated as an active material for the catalytic cracking of waste motor oil into diesel-like fuels. Above all, alkali-treated rice husk biochar showed great activity, with a 250% increase in the kinetic constant compared to thermal cracking. It also showed better activity than previously reported synthetic materials. Moreover, a much lower activation energy (185.77 to 293.48 kJ/mol) for the cracking process was also obtained. According to the materials characterization, the catalytic activity was related more to the nature of the biochar's surface than to its specific surface area. Finally, the liquid products complied with all the physical properties defined by international standards for diesel-like fuels, containing hydrocarbon chains between C10 and C27 similar to those found in commercial diesel.
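The role of activation energy in the reported rate increase follows the Arrhenius relation k = A·exp(−Ea/(RT)). The sketch below takes the two Ea values from the abstract's reported range, but the pre-exponential factor and temperature are purely illustrative assumptions, not the study's fitted kinetics.

```python
import math

R = 8.314  # universal gas constant, J/(mol*K)

def rate_constant(A, Ea, T):
    """Arrhenius rate constant k = A * exp(-Ea / (R * T)).
    Ea in J/mol, T in K, A in the same units as k."""
    return A * math.exp(-Ea / (R * T))

# Illustrative comparison: at the same temperature, a lower activation
# energy gives a faster cracking reaction.
T = 723.0                                        # hypothetical 450 °C
k_low_Ea = rate_constant(1e12, 185_770, T)       # lower end of reported range
k_high_Ea = rate_constant(1e12, 293_480, T)      # upper end of reported range
speedup = k_low_Ea / k_high_Ea
```

The ratio depends only on exp(ΔEa/(RT)) when A is held fixed, which is why even a moderate drop in Ea can dominate the observed rate enhancement.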
Cyber-Physical Production Systems increasingly use semantic information to meet increased flexibility requirements. Ontologies are often used to represent and use this semantic information. Existing systems focus on mapping knowledge and less on the exchange with other relevant IT systems (e.g., ERP systems) in which crucial, often implicit, semantic information is contained. This article presents an approach that enables the exchange of semantic information via adapters. The approach is demonstrated in a use case utilizing an MES and an ERP system.
Introducing continuous experimentation in large software-intensive product and service organisations
(2017)
Software development in highly dynamic environments imposes high risks on development organizations. One such risk is that the developed software may be of little or no value to customers, wasting the invested development effort. Continuous experimentation, as an experiment-driven development approach, may reduce such development risks by iteratively testing product and service assumptions that are critical to the success of the software. Although several experiment-driven development approaches are available, there is little guidance on how to introduce continuous experimentation into an organization. This article presents a multiple-case study that aims at better understanding the process of introducing continuous experimentation into an organization with an already established development process. The results of the study show that companies are open to adopting such an approach and to learning throughout the introduction process. Several benefits were obtained, such as reduced development effort, deeper customer insights, and better support for development decisions. Challenges included complex stakeholder structures, difficulties in defining success criteria, and building experimentation skills. Our findings indicate that organizational factors may limit the benefits of experimentation. Moreover, introducing continuous experimentation requires fundamental changes in how companies operate, and a systematic introduction process can increase the chances of a successful start.
The interaction between lipid bilayers in water has been studied intensively over the last decades. Osmotic stress has been applied to evaluate the forces between two approaching lipid bilayers in aqueous solution. The force–distance relation between lipid mono- or bilayers deposited on mica sheets has also been measured using a surface force apparatus (SFA). Lipid-stabilised foam films offer another possibility to study the interactions between lipid monolayers. These films can be prepared comparatively easily with very good reproducibility. Foam films usually consist of two adsorbed surfactant monolayers separated by a layer of the aqueous solution from which the film is created. Their thickness can be conveniently measured using microinterferometric techniques. Studies with foam films deliver valuable information on the interactions between lipid membranes, and especially on their stability and permeability. Inverse black lipid membrane (BLM) foam films supply information about the properties of lipid self-organisation in bilayers. The present paper summarises results on microscopic lipid-stabilised foam films obtained by measuring their thickness and contact angle. Most of the presented results concern foam films prepared from dispersions of the zwitterionic lipid 1,2-dimyristoyl-sn-glycero-3-phosphorylcholine (DMPC) and some of its mixtures with the anionic lipid 1,2-dimyristoyl-sn-glycero-3-[phospho-rac-(1-glycerol)] (DMPG).
The strength of the long-range and short-range forces between the lipid layers is discussed. The attractive van der Waals force is calculated. The repulsive electrostatic force is estimated from experiments at different electrolyte concentrations (NaCl, CaCl2) or by modifying the electrostatic double-layer surface potential through the incorporation of charged lipids in the lipid monolayers. The short-range interactions are studied and modified by using small carbohydrates (fructose and sucrose), ethanol (EtOH) or dimethylsulfoxide (DMSO). Some results are compared with the structure of lipid monolayers deposited at the liquid/air interface (monolayers spread in a Langmuir trough), which are among the most studied biomembrane model systems. The comparison between the film thickness and the free energy of film formation is used to estimate the contribution of the different components of the disjoining pressure to the total interaction in the film, and their dependence on the composition of the film-forming solution.