The physicochemical properties of synthetically produced bone substitute materials (BSM) have a major impact on biocompatibility. This affects bony tissue integration and osteoconduction, as well as the degradation pattern and the correlated inflammatory tissue responses, including macrophages and multinucleated giant cells (MNGCs). Thus, influencing factors such as size, special surface morphologies, porosity, and interconnectivity have been the subject of extensive research. In the present publication, the influence of granule size was investigated for three identically manufactured bone substitute granules based on the technology of hydroxyapatite (HA)-forming calcium phosphate cements, including the inflammatory response in the surrounding tissue and especially the induction of MNGCs (as a parameter of material degradation). For the in vivo study, granules of three different size ranges (small = 0.355–0.5 mm; medium = 0.5–1 mm; big = 1–2 mm) were implanted in the subcutaneous connective tissue of 45 male BALB/c mice. At 10, 30, and 60 days post implantationem, the materials were explanted and histologically processed. The defect areas were initially examined histopathologically. Furthermore, pro- and anti-inflammatory macrophages were quantified histomorphometrically after their immunohistochemical detection. The number of MNGCs was quantified as well using a histomorphometrical approach. The results showed a granule size-dependent integration behavior. The surrounding granulation tissue had passivated in the groups of the two bigger granules at 60 days post implantationem, including a fibrotic encapsulation, while granulation tissue was still present in the group of the small granules, indicating an ongoing cell-based degradation process. The histomorphometrical analysis showed that the number of proinflammatory macrophages was significantly increased in the small granules at 60 days post implantationem.
Similarly, a significant increase of MNGCs was detected in this group at 30 and 60 days post implantationem. Based on these data, it can be concluded that the integration and/or degradation behavior of synthetic bone substitutes can be influenced by granule size.
Hyperspectral imaging and reflectance spectroscopy in the range from 200 to 380 nm were used to rapidly detect and characterize copper oxidation states and their layer thicknesses on direct bonded copper in a non-destructive way. Single-point UV reflectance spectroscopy, as a well-established method, was used to benchmark the quality of the hyperspectral imaging results. For the laterally resolved measurements of the copper surfaces, a UV hyperspectral imaging setup based on a pushbroom imager was used. Six different types of direct bonded copper were studied. Each type had a different oxide layer thickness and was analyzed by depth profiling using X-ray photoelectron spectroscopy. In total, 28 samples were measured to develop multivariate models to characterize and predict the oxide layer thicknesses. The principal component analysis (PCA) models enabled a general differentiation between the sample types on the first two PCs, with 100.0% and 96% explained variance for UV spectroscopy and hyperspectral imaging, respectively. Partial least squares regression (PLS-R) models showed reliable performance, with R²c = 0.94 and 0.94 and RMSEC = 1.64 nm and 1.76 nm, respectively. The developed in-line prototype system combined with multivariate data modeling shows high potential for further development of this technique towards real large-scale processes.
Fast pyrolysis as a valorization mechanism for banana rachis and low-density polyethylene waste
(2021)
Banana rachis and low-density polyethylene (LDPE) were selected as secondary feedstocks for the study of fast pyrolysis in a free-fall reactor. The experiments were performed at 600 °C for banana rachis and 450 °C for LDPE, based on literature and thermogravimetric analysis. The gaseous products of both feedstocks show a similar composition of C1–C2 compounds, while C3 compounds are only found in LDPE. The liquid products from banana and LDPE correspond to functional groups and shorter hydrocarbons, respectively. Scanning electron microscopy (SEM) and Fourier transform infrared (FTIR) analyses of the char showed important morphological changes to spheres in LDPE and structural changes due to thermal decomposition in the biomass. The pyrolysis char shows high potential as an adsorbent, for encapsulation, or as a catalyst.
While there has been increased digitization of private homes, little has been done to understand these specific home technologies and how they serve consumers, among other issues. “Smart home technology” (SHT) refers to a wide range of artifacts, from cleaning aids to energy advisors. Given this breadth, clarity surrounding the key characteristics and the multi-faceted impact of SHT is needed to conduct more directed research on SHT. We propose a taxonomy to help outline the salient intended outcomes of SHT. Through a process involving five iterations, we analyzed and classified 79 technologies (gathered from literature and industry reports). This uncovered seven dimensions encompassing 20 salient characteristics. We believe these dimensions and characteristics will help researchers and organizations better design and study the impacts of these technologies. Our long-term agenda is to use the proposed taxonomy for an exploratory inquiry into the tensions that occur when personal and sustainability-related outcomes compete.
Visual inspections of product surfaces are predominantly carried out by employees, although automation approaches based on camera and image-processing systems show great potential. Cobots, too, are increasingly being integrated into quality assurance processes. In the following, the possibilities for integrating cobots into visual inspection are discussed, and a decision model is presented with which visual inspection processes can be checked for their cobot suitability. The decision model is designed for direct integration into existing cobot suitability assessment procedures and serves as an initial strategic decision aid.
Various software and hardware systems for digital 3D VR factory planning are available on the market, some of which exhibit considerable compatibility problems. For assessing hardware suitability for 3D VR factory planning, an evaluation system is presented and explained using concrete software applications and a passive 3D stereo monitor with head tracking. The necessity of using software middleware to increase usability is also discussed.
In this paper, we propose a radical new approach for scale-out distributed DBMSs. Instead of hard-baking an architectural model, such as a shared-nothing architecture, into the distributed DBMS design, we aim for a new class of so-called architecture-less DBMSs. The main idea is that an architecture-less DBMS can mimic any architecture on a per-query basis on-the-fly without any additional overhead for reconfiguration. Our initial results show that our architecture-less DBMS AnyDB can provide significant speedup across varying workloads compared to a traditional DBMS implementing a static architecture.
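The core idea of an architecture-less DBMS, choosing an execution architecture per query rather than fixing one at design time, can be illustrated with a toy dispatcher. The strategy names and the cost heuristic below are invented for illustration and do not reflect AnyDB's actual design:

```python
# Toy illustration of per-query architecture selection.
# Strategy names and thresholds are invented for illustration only;
# they do not reflect AnyDB's actual implementation.
from dataclasses import dataclass

@dataclass
class Query:
    scan_bytes: int      # estimated bytes scanned
    join_partners: int   # number of tables joined

def pick_architecture(q: Query) -> str:
    """Mimic a different execution architecture per query, on the fly."""
    if q.join_partners == 0 and q.scan_bytes > 10**9:
        return "shared-disk scan"      # scale out readers over shared storage
    if q.join_partners >= 2:
        return "shared-nothing join"   # partition and exchange between nodes
    return "single-node"               # small query: avoid distribution overhead

print(pick_architecture(Query(scan_bytes=5 * 10**9, join_partners=0)))  # shared-disk scan
print(pick_architecture(Query(scan_bytes=10**6, join_partners=3)))      # shared-nothing join
```

The point is that the decision happens at query time with no reconfiguration step, which is what distinguishes this style from a statically chosen architecture.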
The early detection of head and neck cancer remains a challenging task. It requires a precise and accurate identification of tissue alterations as well as a clear discrimination of cancerous from healthy tissue areas. A novel approach for this purpose uses microspectroscopic techniques, with a special focus on hyperspectral imaging (HSI) methods. Our proof-of-principle study presents the implementation and application of darkfield elastic light scattering spectroscopy (DF ELSS) as a non-destructive, high-resolution, and fast imaging modality to distinguish healthy from altered lingual tissue regions in a mouse model. The main aspect of our study is the comparison of two different HSI detection principles, point-by-point and line-scanning imaging, and whether one might be more appropriate for differentiating several tissue types. Statistical models are built by applying a principal component analysis (PCA) with Bayesian discriminant analysis (DA) to the elastic light scattering (ELS) spectra. Overall accuracy, sensitivity, and precision values of 98% are achieved for both models, whereas the overall specificity reaches 99%. An additional classification of model-unknown ELS spectra is performed. The predictions are verified with histopathological evaluations of identical HE-stained tissue areas to prove the model’s capability of tissue distinction. In the context of our proof-of-principle study, we assess the Pushbroom PCA-DA model to be more suitable for tissue type differentiation and thus tissue classification. In addition to the HE examination in head and neck cancer diagnosis, the use of HSI-based statistical models might be conceivable in daily clinical routine.
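The PCA-plus-discriminant-analysis classification chain used above can be sketched in a few lines. The spectra below are simulated stand-ins for the measured ELS data, and sklearn's `LinearDiscriminantAnalysis` stands in for the Bayesian DA described in the study:

```python
# Sketch of a PCA + discriminant-analysis classification chain on synthetic
# "spectra"; the two classes are simulated stand-ins for healthy/altered tissue.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)
n_per_class, n_channels = 50, 120
healthy = rng.normal(0.0, 0.05, (n_per_class, n_channels)) + np.linspace(0, 1, n_channels)
altered = rng.normal(0.0, 0.05, (n_per_class, n_channels)) + np.linspace(0.2, 1.1, n_channels)
X = np.vstack([healthy, altered])
y = np.array([0] * n_per_class + [1] * n_per_class)

# Reduce the spectra to a few principal components, then discriminate
model = make_pipeline(PCA(n_components=5), LinearDiscriminantAnalysis())
model.fit(X, y)
acc = model.score(X, y)
print(f"training accuracy = {acc:.2f}")
```

In the actual study, performance would be reported on model-unknown spectra (as the abstract describes), not on the training data.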
Forecasting demand is challenging. Different products exhibit different demand patterns. While demand may be constant and regular for one product, it may be sporadic for another, and even when demand occurs, its magnitude may fluctuate significantly. Forecasting errors are costly and result in obsolete inventory or unsatisfied demand. Methods from statistics, machine learning, and deep learning have been used to predict such demand patterns. Nevertheless, it is not clear which algorithm achieves the best forecast for a given demand pattern. Therefore, even today, a large number of models are run on a test period, and the model with the best result on the test period is used for the actual forecast. This approach is computationally and time intensive and, in most cases, uneconomical. In our paper, we show that a machine learning classification algorithm can predict the best possible model based on the characteristics of a time series. The approach was developed and evaluated on a dataset from a B2B technical retailer. The machine learning classification algorithm achieves a mean ROC-AUC of 89%, which underlines the skill of the model.
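The idea of a meta-classifier that maps time-series characteristics to the best forecasting model can be sketched as follows. The features (inter-demand interval, squared coefficient of variation), the two model labels, and the simulated training data are illustrative assumptions, not the authors' actual setup:

```python
# Illustrative sketch: classify a time series by simple demand-pattern
# features and predict which forecasting model to use. Features, labels,
# and data below are assumptions for illustration, not the paper's setup.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(2)

def features(ts):
    """Two classic demand-pattern features: ADI and CV^2 of nonzero demand."""
    nonzero = ts[ts > 0]
    adi = len(ts) / max(len(nonzero), 1)      # average inter-demand interval
    cv2 = (nonzero.std() / nonzero.mean()) ** 2 if len(nonzero) else 0.0
    return [adi, cv2]

# Simulated training set: smooth series -> "ets", intermittent -> "croston"
X, y = [], []
for _ in range(200):
    smooth = rng.normal(100, 5, 52)                      # regular weekly demand
    X.append(features(smooth)); y.append("ets")
    intermittent = rng.poisson(0.3, 52) * rng.integers(5, 50)
    X.append(features(intermittent)); y.append("croston")

clf = RandomForestClassifier(random_state=0).fit(X, y)
print(clf.predict([features(rng.normal(100, 5, 52))])[0])  # expected: "ets"
```

The saved effort comes at prediction time: instead of fitting every candidate model on a test period, only the single predicted model is fitted.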
Maintenance is an increasingly complex and knowledge-intensive field. To address these challenges, assistance systems based on augmented, mixed, or virtual reality can be applied. The objective of this paper is therefore to present a framework that can be used to identify, select, and implement an assistance system based on reality technology in the maintenance environment. The development of the framework is based on a systematic literature review and subject matter expert interviews. The framework provides the best technological and economic solution in several steps. The validation of the framework is carried out through a case study.
Purpose
Injury or inflammation of the middle ear often results in persistent tympanic membrane (TM) perforations, leading to conductive hearing loss (HL). However, in some cases the magnitude of HL exceeds that attributable to the TM perforation alone. The aim of this study is to better understand the effects of the location and size of TM perforations on the sound transmission properties of the middle ear.
Methods
The middle ear transfer functions (METF) of six human temporal bones (TB) were compared before and after perforating the TM at different locations (anterior or posterior lower quadrant) and to different degrees (1 mm, ¼ of the TM, ½ of the TM, and full ablation). The sound-induced velocity of the stapes footplate was measured using single-point laser-Doppler-vibrometry (LDV). The METF were correlated with a Finite Element (FE) model of the middle ear, in which similar alterations were simulated.
Results
The measured and calculated METF showed frequency and perforation size dependent losses at all perforation locations. Starting at low frequencies, the loss expanded to higher frequencies with increased perforation size. In direct comparison, posterior TM perforations affected the transmission properties to a larger degree than anterior perforations. The asymmetry of the TM causes the malleus-incus complex to rotate and results in larger deflections in the posterior TM quadrants than in the anterior TM quadrants. Simulations in the FE model with a sealed cavity show that small perforations lead to a decrease in TM rigidity and thus to an increase in oscillation amplitude of the TM mainly above 1 kHz.
Conclusion
Size and location of TM perforations have a characteristic influence on the METF. The correlation of the experimental LDV measurements with an FE model contributes to a better understanding of the pathologic mechanisms of middle-ear diseases. If small perforations with significant HL are observed in daily clinical practice, additional middle ear pathologies should be considered. Further investigations on the loss of TM pretension due to perforations may be informative.
This article studies the effects of reverse factoring in a supply chain in which the buyer company extends its lower short-term borrowing rates to the supplier in return for extended payment terms. We explore the role of interest rate changes, rating changes, and the position in the business cycle on the cost-benefit trade-off from a supplier perspective. We use a combined empirical approach consisting of an event study in Step 1 and a simulation model in Step 2. The event study identifies the quantitative magnitude of central bank decisions and rating changes on the interest rate differential. The simulation uses a rolling-window methodology to compute the daily costs and benefits of reverse factoring from 2010 to 2018 under the assumption of the efficient market hypothesis. Our major finding is that changes in crucial financial variables such as interest rates, ratings, or news alerts can turn former win-win situations into win-lose situations for the supplier, contingent on the business cycle. Overall, our results reveal complex trade-offs under reverse factoring and consequently call for careful evaluation in managerial decisions.
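The supplier-side trade-off described above can be illustrated with a stylized calculation: the benefit of financing receivables at the buyer's cheaper rate against the cost of waiting longer for payment. The rates, terms, and the trade-off rule are illustrative assumptions, not the paper's calibrated model:

```python
# Stylized supplier-side cost-benefit calculation for reverse factoring.
# Rates, day-count convention, and terms are illustrative assumptions,
# not the paper's calibrated simulation model.
def rf_benefit(invoice, supplier_rate, buyer_rate, old_term_days, new_term_days):
    """Net benefit = financing saved at the buyer's cheaper rate over the
    original term, minus the cost of financing the payment-term extension."""
    saving = invoice * (supplier_rate - buyer_rate) * old_term_days / 360
    extension_cost = invoice * buyer_rate * (new_term_days - old_term_days) / 360
    return saving - extension_cost

# Win-win while the rate spread is wide...
print(rf_benefit(100_000, supplier_rate=0.06, buyer_rate=0.02,
                 old_term_days=30, new_term_days=60))
# ...but a narrowing spread (e.g., after a buyer rating downgrade) flips
# the sign and turns the arrangement into win-lose for the supplier.
print(rf_benefit(100_000, supplier_rate=0.03, buyer_rate=0.02,
                 old_term_days=30, new_term_days=60))
```

This mirrors the paper's qualitative finding: the same contract terms can be beneficial or harmful depending on the prevailing interest rate differential.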
A full understanding of the relationship between surface properties, protein adsorption, and immune responses is lacking but is of great interest for the design of biomaterials with desired biological profiles. In this study, polyelectrolyte multilayer (PEM) coatings with gradient changes in surface wettability were developed to shed light on how this impacts protein adsorption and immune response in the context of material biocompatibility. The analysis of immune responses by peripheral blood mononuclear cells to PEM coatings revealed an increased expression of proinflammatory cytokines tumor necrosis factor (TNF)-α, macrophage inflammatory protein (MIP)-1β, monocyte chemoattractant protein (MCP)-1, and interleukin (IL)-6 and the surface marker CD86 in response to the most hydrophobic coating, whereas the most hydrophilic coating resulted in a comparatively mild immune response. These findings were subsequently confirmed in a cohort of 24 donors. Cytokines were produced predominantly by monocytes with a peak after 24 h. Experiments conducted in the absence of serum indicated a contributing role of the adsorbed protein layer in the observed immune response. Mass spectrometry analysis revealed distinct protein adsorption patterns, with more inflammation-related proteins (e.g., apolipoprotein A-II) present on the most hydrophobic PEM surface, while the most abundant protein on the hydrophilic PEM (apolipoprotein A-I) was related to anti-inflammatory roles. The pathway analysis revealed alterations in the mitogen-activated protein kinase (MAPK)-signaling pathway between the most hydrophilic and the most hydrophobic coating. The results show that the acute proinflammatory response to the more hydrophobic PEM surface is associated with the adsorption of inflammation-related proteins. 
Thus, this study provides insights into the interplay between material wettability, protein adsorption, and inflammatory response and may act as a basis for the rational design of biomaterials.
Conventional production systems are evolving more and more into "smart" production systems through cyber-physical systems and application-oriented AI approaches; these systems are characterized, among other things, by a high level of communication and integration of the individual components. The exchange of information between the systems is usually oriented only towards the data content, with semantics considered only implicitly. The adaptability demanded by external and internal influences requires the integration of new components or the redesign of existing ones. An open, application-oriented ontology extends the information and communication exchange with explicit semantic information. This enables better integration of new components and easier reconfiguration of existing ones. The developed ontology and the derived application and use of the semantic information are evaluated by means of a practical use case.
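How explicit semantic annotations ease reconfiguration can be sketched with a minimal triple store: components are discovered by published capability rather than by hard-coded identifier. The vocabulary (`ex:provides`, the station and robot names) is invented for illustration and is not the developed ontology:

```python
# Minimal sketch of semantic component discovery via (subject, predicate,
# object) triples. The vocabulary here is invented for illustration;
# it does not reproduce the ontology developed in the paper.
triples = {
    ("Station1", "rdf:type", "ex:AssemblyStation"),
    ("Station1", "ex:provides", "ex:ScrewDriving"),
    ("Robot2", "rdf:type", "ex:MobileRobot"),
    ("Robot2", "ex:provides", "ex:Transport"),
}

def components_providing(capability):
    """Find components by capability instead of by hard-coded identifier."""
    return {s for (s, p, o) in triples if p == "ex:provides" and o == capability}

# A new component only needs to publish its capabilities to be discovered,
# without changing any of the consumers that query the graph:
triples.add(("Cobot3", "ex:provides", "ex:ScrewDriving"))
print(sorted(components_providing("ex:ScrewDriving")))  # ['Cobot3', 'Station1']
```

In practice this role is played by an ontology language such as OWL with a reasoner, but the integration benefit, discovery by meaning instead of by identifier, is the same.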
Learning factories on demand
(2021)
Learning factories are research and learning environments that demonstrate new concepts and technologies for industry in a practical setting. The interaction between physical and virtual components is a central aspect. Teaching and presentation usually take place directly in the learning factory and are thus limited in time and in the reachable user group. A learning factory on demand can be provided by dividing and virtualizing the individual components via containers and microservices. This enables both local operation and operation in hybrid cloud or cloud systems. Physical components can be mapped either through standardized interfaces or through suitable emulators. Using the example of the learning factory at Reutlingen University (Werk150), it is shown how different use cases can be made available by means of software-based orchestration, thus promoting broader and more independent teaching.
In a networked world, companies depend on fast and smart decisions, especially when it comes to reacting to external change. With the wealth of data available today, smart decisions can increasingly be based on data analysis and be supported by IT systems that leverage AI. A global pandemic brings external change to an unprecedented level of unpredictability and severity of impact. Resilience therefore becomes an essential factor in most decisions when aiming at making and keeping them smart. In this chapter, we study the characteristics of resilient systems and test them with four use cases in a wide-ranging set of application areas. In all use cases, we highlight how AI can be used for data analysis to make smart decisions and contribute to the resilience of systems.
Prior to the introduction of AI-based forecast models in the procurement department of an industrial retail company, we assessed the digital skills of the procurement employees and surveyed their attitudes toward a new digital technology. The aim of the survey was to ascertain important contextual factors which are likely to influence the acceptance and the successful use of the new forecast tool. What we find is that the digital skills of the employees show an intermediate level and that their attitudes toward key aspects of new digital technologies are largely positive. Thus, the conditions for high acceptance and the successful use of the models are good, as evidenced by the high intention of the procurement staff to use the models. In line with previous research, we find that the perceived usefulness of a new technology and the perceived ease of use are significant drivers of the willingness to use the new forecast tool.
Autonomization of shop floor management: the path from analog to autonomous shop floor management
(2021)
New digitization, networking, and artificial intelligence technologies will increasingly find their way into shop floor management (SFM). This article describes in four stages how classical SFM could evolve via digital SFM into a smart and, eventually, autonomous SFM. Building on this, it discusses what effects the use of these new technologies would have on the operational design and execution of SFM, and what consequences this would entail for employees and managers.
Context
Microservices as a lightweight and decentralized architectural style with fine-grained services promise several beneficial characteristics for sustainable long-term software evolution. Success stories from early adopters like Netflix, Amazon, or Spotify have demonstrated that it is possible to achieve a high degree of flexibility and evolvability with these systems. However, the described advantageous characteristics offer no concrete guidance and little is known about evolvability assurance processes for microservices in industry as well as challenges in this area. Insights into the current state of practice are a very important prerequisite for relevant research in this field.
Objective
We therefore wanted to explore how practitioners structure the evolvability assurance processes for microservices, what tools, metrics, and patterns they use, and what challenges they perceive for the evolvability of their systems.
Method
We first conducted 17 semi-structured interviews and discussed 14 different microservice-based systems and their assurance processes with software professionals from 10 companies. Afterwards, we performed a systematic grey literature review (GLR) and used the created interview coding system to analyze 295 practitioner online resources.
Results
The combined analysis revealed the importance of finding a sensible balance between decentralization and standardization. Guidelines like architectural principles were seen as valuable to ensure a base consistency for evolvability and specialized test automation was a prevalent theme. Source code quality was the primary target for the usage of tools and metrics for our interview participants, while testing tools and productivity metrics were the focus of our GLR resources. In both studies, practitioners did not mention architectural or service-oriented tools and metrics, even though the most crucial challenges like Service Cutting or Microservices Integration were of an architectural nature.
Conclusions
Practitioners relied on guidelines, standardization, or patterns like Event-Driven Messaging to partially address some reported evolvability challenges. However, specialized techniques, tools, and metrics are needed to support industry with the continuous evaluation of service granularity and dependencies. Future microservices research in the areas of maintenance, evolution, and technical debt should take our findings and the reported industry sentiments into account.
The article analyzes experimentally and theoretically the influence of microscope parameters on pinhole-assisted Raman depth profiles in uniform and composite refractive media. The main objective is the reliable mapping of deep sample regions. The easiest-to-interpret results are obtained with low magnification, low aperture, and small pinholes. Here, the intensities and shapes of the Raman signals are independent of the location of the emitter relative to the sample surface. Theoretically, the results can be well described with a simple analytical equation containing the axial depth resolution of the microscope and the position of the emitter. The smallest determinable object size is limited to 2–4 μm. If sub-micrometer resolution is desired, high magnification, usually combined with high aperture, becomes necessary. In refractive media, the signal intensities and shapes then depend on the position relative to the sample surface. This aspect is investigated on a number of uniform and stacked polymer layers, 2–160 μm thick, with the best available transparency. The experimental depth profiles are numerically fitted with excellent accuracy by inserting a Gaussian excitation beam of variable waist and fill fraction through the focusing lens area, and by treating the Raman emission with geometric optics as a spontaneous isotropic process through the lens and the variable pinhole, respectively. The intersectional area of these two solid angles yields the leading factor in understanding confocal (pinhole-assisted) Raman depth profiles.
Increasing urban population growth confronts cities with challenges in many respects: urbanization problems such as excessive environmental pollution or increasing urban traffic demand new and innovative solutions. In this context, the concept of smart cities is discussed. An enabling element of the smart city concept is the application of information technology (IT) to improve administrative efficiency and quality of life while reducing costs and resource consumption and ensuring greater citizen participation in administrative and urban development issues. While these smart city services are technologically studied and implemented, government officials, citizens, and businesses are often unaware of the large variety of smart city service solutions. Therefore, this work deals with developing a smart city services catalogue that documents best-practice services to create a platform that brings citizens, city government, and businesses together. Although the concept of IT service catalogues is not new, and guidelines and recommendations for the design and development of service catalogues already exist in the corporate context, there is little work on smart city service catalogues. Therefore, approaches from agile software development and pattern research were adapted to develop the smart city service catalogue platform in this work.
Providing clinical information in the operating room is an important aspect of supporting the surgical team. Robot-assisted esophageal resection is a particularly complex procedure that offers potential for workflow-based support. We present first results from the development of a checklist tool, together with the underlying modeling of the surgical workflow and the surgeons' information needs. The checklist tool displays the steps to be performed in chronological order and provides additional information in a context-adapted manner. Automatic documentation of the start and end times of individual surgical phases and steps is intended to enable future process analyses of the operation.
Imagine a world in which the search for tomorrow's trends in (software) products is not subject to a long and laborious data search but is possible with a single mouse click. Through the use of artificial intelligence (AI), this reality is made possible and is to be further advanced through research. The study therefore aims to provide an initial overview of this young research field. Based on research, expert interviews, and company and student surveys, current application possibilities of AI in the innovation process (defined as Smart Innovation) and existing challenges that slow down further development are discussed in more detail, and future application possibilities are presented. Finally, a recommendation for action is made for business, politics, and science to help overcome the current obstacles together and thus drive the future of Smart Innovation.
Adaptation of the business model canvas template to develop business models for the circular economy
(2021)
The Business Model Canvas, as a template for strategic management, serves the development of new and the documentation of existing linear business models. However, the change towards a Circular Economy requires new value creation structures and thus changed business models. To develop business models for circular economies, it is necessary to adapt the existing template, since the actors involved along the value chain take on changed roles. In this paper, a template based on the existing Business Model Canvas is presented, which allows business models for a Circular Economy to be developed and documented.
The paper explains a workflow to simulate the food-energy-water (FEW) nexus for an urban district, combining various data sources such as 3D city models, particularly the City Geography Markup Language (CityGML) data model from the Open Geospatial Consortium, OpenStreetMap, and census data. A long-term vision is to extend the CityGML data model by developing a FEW Application Domain Extension (FEW ADE) to support future FEW simulation workflows such as the one explained in this paper. Together with the mentioned simulation workflow, this paper also identifies necessary FEW-related parameters for the future development of a FEW ADE. Furthermore, relevant key performance indicators are investigated, and the datasets necessary to calculate these indicators are studied. Finally, different calculations are performed for the downtown borough Ville-Marie in the city of Montréal (Canada) for the domains of food waste (FW) and wastewater (WW) generation. For this study, a workflow is developed to calculate the energy generation from anaerobic digestion of FW and WW. In the first step, data collection and preparation were carried out: relevant data for georeferencing, for the model set-up, and for creating the required usage libraries, such as food waste and wastewater generation per person, were collected. The next step was the data integration and calculation of the relevant parameters, and lastly, the results were visualized for analysis purposes. As a use case to support such calculations, the CityGML level-of-detail-2 model of Montréal is enriched with information such as building functions and building usages from OpenStreetMap. The calculation of the total residents based on the CityGML model as the main input for Ville-Marie results in a population of 72,606. The statistical value for 2016 was 89,170, which corresponds to a deviation of 15.3%.
The energy recovery potential of FW is about 24,024 GJ/year, and that of wastewater is about 1,629 GJ/year, adding up to 25,653 GJ/year. Relating these values to the calculated number of inhabitants of Ville-Marie results in 330.9 kWh/year for FW and 22.4 kWh/year for wastewater, respectively.
A laboratory prototype for hyperspectral imaging in the ultraviolet (UV) region from 225 to 400 nm was developed and used to rapidly characterize active pharmaceutical ingredients (API) in tablets. The APIs are ibuprofen (IBU), acetylsalicylic acid (ASA), and paracetamol (PAR). Two sample sets were used for comparison purposes. Sample set one comprises tablets of 100% API, and sample set two consists of commercially available painkiller tablets. Reference measurements were performed on the pure APIs in liquid solutions (transmission) and in the solid phase (reflection) using a commercial UV spectrometer. The spectroscopic part of the prototype is based on a pushbroom imager that contains a spectrograph and a charge-coupled device (CCD) camera. The tablets were scanned on a conveyor belt positioned inside a tunnel made of polytetrafluoroethylene (PTFE) in order to increase the homogeneity of illumination at the sample position. Principal component analysis (PCA) was used to differentiate the hyperspectral data of the drug samples. The first two PCs are sufficient to completely separate all samples. The rugged design of the prototype opens new possibilities for further development of this technique towards real large-scale applications.
Hypericin has large potential in modern medicine and exhibits fascinating structural dynamics, such as multiple conformations and tautomerization. However, it is difficult to study individual conformers/tautomers, as they cannot be isolated due to the similarity of their chemical and physical properties. An approach to overcome this difficulty is to combine single-molecule experiments with theoretical studies. Time-dependent density functional theory (TD-DFT) calculations reveal that tautomerization of hypericin occurs via a two-step proton transfer with an energy barrier of 1.63 eV, whereas a direct single-step pathway has a large activation energy barrier of 2.42 eV. Tautomerization in hypericin is accompanied by a reorientation of the transition dipole moment, which can be directly observed as fluorescence intensity fluctuations. Quantitative tautomerization residence times can be obtained from the autocorrelation of the temporal emission behavior, revealing that hypericin stays in the same tautomeric state for several seconds, which can be influenced by the embedding matrix. Furthermore, replacing hydrogen with deuterium provides additional evidence that the underlying process is based on the tunneling of a proton. In addition, the tautomerization rate can be influenced by a λ/2 Fabry–Pérot microcavity, where the occupation of Raman-active vibrations can alter the tunneling rate.
Teaching at assembly workstations in production at SMEs (small and medium-sized enterprises) often does not take place at all or only insufficiently. In addition to the lack of technical content, there are also movement sequences that are incorrect from an ergonomic point of view, which "untrained" people usually acquire automatically. An AI-based approach is used to analyze a defined workflow for a specific assembly scope with regard to the behavior of several employees. Based on these different behaviors, the AI gives feedback on the points in time, work steps, and movements at which particularly dangerous incorrect postures occur. Motion capturing and digital human model simulation, in combination with the results of the AI, define the optimized workflow. Individual employees can be trained directly because the AI identifies their most serious incorrect postures and provides them with a direct comparison of their "wrong" posture and an "easy-on-the-joints" posture. With the assistance of various test persons, the AI can conduct a study in which the most frequently occurring incorrect postures are identified. This can be done in general or tailored to specific groups of people (e.g. "People over 1.90 m tall must be particularly careful not to make the following mistake..."). The approach will be tested and validated at the Werk150, the factory of the ESB Business School on the campus of Reutlingen University. The newly gained knowledge will subsequently be used for training in SMEs.
Entrepreneurial software engineering: towards a hybrid development method for early-stage startups
(2021)
A considerable share of innovative software-intensive products is developed by startups. However, product development in an early-stage startup is not a sequential process. A business idea is usually based on a number of assumptions, and the riskiest assumptions need to be tested. Depending on the test results, a product strategy may change several times. This raises the question of how to create sufficiently stable software using engineering principles despite a dynamic product strategy that is subject to many uncertainties. Hybrid development methods that combine agile aspects with classical engineering methods seem to be a good choice in such a startup context. This paper proposes a lightweight hybrid development method that provides early-stage startups with a framework to support the development of single-feature minimum viable products. The method was derived from a startup company's founding case and evaluated in expert interviews. The proposed method is intended to provide a basis for discussion between practitioners and scientists with the aim of better understanding the application of software engineering principles in software startups.
Effective risk management should take into account not only quantifiable, known risks but also events that have either already occurred in a similar form or are conceivable in principle. To identify these "grey swans", institutional and organisational preconditions must be created and analytical and conceptual instruments must be provided.
Purpose: To develop a method for synthesizing a fuzzy automatic control system for a shearer drum in terms of coal seam hypsometry, based on the information criterion of the beginning of rock cutting by the drum, in order to reduce the ash content of the extracted coal.
Methodology: Taking into consideration the peculiarities of determining a distinct information criterion of the beginning of rock cutting by the drum, and the regularities of its variation during shearer operation, a fuzzy inference algorithm is developed for a system of fuzzy automatic drum control in terms of seam hypsometry. In this context, the rules of fuzzy productions, the parameters of the membership functions of the terms of the output linguistic variable, and the fuzzy operations are substantiated according to the recommendations of the classic Mamdani fuzzy inference algorithm. Studies are carried out to analyze the efficiency of the proposed fuzzy inference algorithm based on the introduced relative parameter of the number of effective control actions formed by the fuzzy control system. Simulation modeling makes it possible to perform a comparative analysis of the efficiency of the drum control.
Findings: In the course of the research, an algorithm for fuzzy control of the shearer's upper drum in terms of coal seam hypsometry has been developed, based on detecting the direct and inverse transitions from coal breaking near the seam roof to rock breaking by the shearer drum with the help of a statistical analysis of the stator power of the cutting drive motor.
Originality: For the first time, a method of synthesis of fuzzy automatic control of the drum in terms of seam hypsometry has been proposed.
Practical value: The proposed method is the theoretical basis for solving the important scientific and applied problem of automating the coal shearer drum in terms of seam hypsometry in order to reduce the ash content of the produced coal.
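To make the Mamdani scheme mentioned above concrete, the following is a minimal, self-contained sketch of fuzzy inference with triangular membership functions, min-implication, max-aggregation, and centroid defuzzification. The input variable, terms, rules, and all parameters are invented for illustration and are not the ones substantiated in the paper:

```python
# Minimal Mamdani-style fuzzy inference sketch: one input (relative deviation of
# the cutting drive motor's stator power) and one output (drum height correction
# on a normalized axis [-1, 1]). All membership functions are illustrative only.

def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def infer(power_dev):
    # Fuzzify the input into three terms.
    low    = tri(power_dev, -1.5, -1.0, 0.0)   # coal cutting near the seam roof
    normal = tri(power_dev, -0.5,  0.0, 0.5)   # boundary region
    high   = tri(power_dev,  0.0,  1.0, 1.5)   # rock cutting by the drum

    # Rules: low -> raise the drum, normal -> hold, high -> lower the drum.
    xs = [i / 100 - 1.0 for i in range(201)]   # discretized output axis
    num = den = 0.0
    for x in xs:
        # Mamdani min-implication, max-aggregation over the three rules.
        mu = max(min(low,    tri(x,  0.0,  0.5, 1.0)),
                 min(normal, tri(x, -0.5,  0.0, 0.5)),
                 min(high,   tri(x, -1.0, -0.5, 0.0)))
        num += mu * x
        den += mu
    return num / den if den else 0.0           # centroid defuzzification

print(infer(0.9))   # clearly "rock": negative correction, i.e. lower the drum
```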
Bedarfe gezielt erheben
(2021)
The hearing contact lens (HCL) is a new type of hearing aid. One of its main components is a piezoelectric actuator (PEA). In order to evaluate and maximize the HCL's performance, a model of the HCL coupled to the middle ear was developed using the finite element (FE) approach. To validate the model, vibrational measurements on the HCL and on temporal bones were performed using a laser Doppler vibrometer (LDV). The model was validated step by step, starting with the HCL only. Then a silicone cap was fitted onto the HCL to provide an interface between the HCL and the tympanic membrane. The HCL was placed on the tympanic membrane and additional measurements were performed to validate the coupled model. The model was used to evaluate the sensitivity of geometrical and material parameters with respect to performance measures of the HCL. Moreover, deeper insight was gained into the feedback behavior, which causes whistling sounds, and into the contact between the HCL and the tympanic membrane.
Despite the low-interest-rate environment, working capital management remains an important driver of value metrics in companies and an important management instrument. Our results for 115 companies from the most important German indices in the years 2011 to 2017 show that effective working capital management can have a positive influence on profitability and company value. At the same time, however, our results also show that working capital management has recently received less attention and that digital innovations are presumably not yet being used to increase efficiency to the extent that appears possible. Even against the background of persistently low capital market interest rates, this must be viewed critically.
The digitization of factories will be a significant issue for the 2020s. New scenarios are emerging to increase the efficiency of production lines inside the factory, based on a new generation of collaborative robot functions. Manufacturers are moving towards data-driven ecosystems by leveraging product lifecycle data from connected goods. Energy-efficient communication schemes as well as scalable data analytics will support these various data collection scenarios. With augmented reality, new remote services are emerging that facilitate the efficient sharing of knowledge in the factory. Future communication solutions should ensure that connectivity between the various production sites spread worldwide and the new players in the value chain (e.g., suppliers, logistics) is transparent, real-time, and secure. Industry 4.0 brings more intelligence and flexibility to production, resulting in more lightweight equipment and thus better ergonomics. 5G will guarantee real-time transmissions with latencies of less than 1 ms. This will provide manufacturers with new possibilities to collect data and trigger actions automatically.
Theory and practice of implementing a successful enterprise IoT strategy in the industry 4.0 era
(2021)
Since the arrival of the internet and affordable access to technologies, digital technologies have occupied a growing place in industry, propelling us towards a fourth industrial revolution: Industry 4.0. In today's era of digital upheaval, enterprises are increasingly undergoing transformations that lead to their digitalization. The traditional manufacturing industry is in the throes of a digital transformation that is accelerated by exponentially growing technologies (e.g., intelligent robots, the Internet of Things, sensors, 3D printing). Around the world, enterprises are in a frantic race to implement IoT-based solutions in order to improve their productivity and innovation, reduce costs, and strengthen their position in international markets. Considering the immense transformative potential that the IoT and big data bring to the industrial sector, the adoption of IoT across industrial systems is a challenge that must be met to remain competitive and thus transform the factory into a smart factory. This paper describes the innovation and digitalization process, following the Industry 4.0 paradigm, for implementing a successful enterprise IoT strategy.
To ease the transition from school to university, students of technical subjects often need to refresh their knowledge of mathematics and physics. An online learning system for physics can support students in engaging with physics content. In addition, a physics knowledge test can reveal gaps in a student's individual knowledge and motivate them to study the missing topics. The working group "eLearning in der Physik" of the Hochschulföderation Süd-West (HfSW), consisting of the Baden-Württemberg universities of applied sciences Aalen, Esslingen, Heilbronn, Mannheim, and Reutlingen, has compiled a pool of more than 200 physics exercises for first-semester students. Together with solutions, these are available to students for self-study in learning management systems and now also in the "Zentrales Open Educational Resources Repositorium der Hochschulen in Baden-Württemberg" (ZOERR). This article reports on the use of the online exercises in 2020/2021, on the results of the knowledge tests, and on the eTutorials newly established during the COVID-19 period.
Highly viscous bioinks offer great advantages for the three-dimensional fabrication of cell-laden constructs by microextrusion printing. However, no standardised method of mixing a high-viscosity biomaterial ink and a cell suspension has been established so far, leading to non-reproducible printing results. A novel method for the homogeneous and reproducible mixing of the two components using a mixing unit connecting two syringes is developed and investigated. Several static mixing units, based on established mixing designs, were adapted, and their functionality was determined by analysing specific features of the resulting bioink. As a model system, we selected a highly viscous ink consisting of fresh frozen human blood plasma, alginate, and methylcellulose, and a cell suspension containing immortalized human mesenchymal stem cells. This bioink is crosslinked after fabrication. A pre-crosslinked gellan gum-based bioink providing a different extrusion behaviour was introduced to validate the conclusions drawn from the model system. For characterisation, bioink from different zones within the mixing device was analysed by measuring its viscosity, shape fidelity after printing, and visual homogeneity. When taking all three parameters into account, a comprehensive and reliable comparison of the mixing quality was possible. In comparison to the established method of manual mixing inside a beaker using a spatula, a significantly higher proportion of viable cells was detected directly after mixing and plotting for both bioinks when the mixing unit was used. A screw-like mixing unit, termed "HighVisc", was found to result in a homogeneous bioink after a low number of mixing cycles while achieving high cell viability rates.
Facial beauty prediction (FBP) aims to develop a machine that automatically assesses facial attractiveness. In the past, such results were highly correlated with human ratings and therefore also with the annotators' biases. As artificial intelligence can exhibit racist and discriminatory tendencies, the causes of skews in the data must be identified. The development of training data and AI algorithms that are robust against biased information is a new challenge for scientists. As aesthetic judgement is usually biased, we take this one step further and propose an unbiased convolutional neural network for FBP. While it is possible to create network models that rate the attractiveness of faces at a high level, from an ethical point of view it is equally important to make sure the model is unbiased. In this work, we introduce AestheticNet, a state-of-the-art attractiveness prediction network, which significantly outperforms competitors with a Pearson correlation of 0.9601. Additionally, we propose a new approach for generating a bias-free CNN to improve fairness in machine learning.
We examine the role of communication from users on dropout from digital learning systems to answer the following questions: (1) How does the sentiment within qualitative signals (user comments) affect dropout rates? (2) Does the variance in the proportion of positive and negative sentiments affect dropout rates? (3) How do quantitative signals (e.g., likes) moderate the effect of the qualitative signals? (4) How does the effect of qualitative signals on dropout rates change across early and late stages of learning? Our hypotheses draw on learning theory and self-regulation theory and were tested using data from 447 learning videos across 32 series of online tutorials, spanning 12 different fields of learning. The findings indicate a main effect of negative sentiment on dropout rates but no effect of positive sentiment on preventing dropout behaviour. This main effect is stronger in the early stages of learning and weakens at later stages. We also observe an effect of the extent of variance of positive and negative sentiments on dropout behaviour. These effects are negatively moderated by quantitative signals. Overall, making commenting more broad-based rather than polarised can be a useful strategy in managing learning, transferring knowledge, and building consensus.
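As a small illustration of how such qualitative signals can be quantified, the share of negative comments per video and the dispersion of that share across videos might be computed as follows; the comment labels are invented sample data, not the study's corpus:

```python
from statistics import pvariance

# Toy computation of two qualitative signals: the per-video share of negative
# comments and the variance of that share across videos. Sample data invented.
videos = {
    "video_1": ["pos", "pos", "neg", "neu"],
    "video_2": ["neg", "neg", "neg", "pos"],
    "video_3": ["pos", "neu", "neu", "neu"],
}

def share(comments, label):
    """Fraction of comments carrying the given sentiment label."""
    return sum(c == label for c in comments) / len(comments)

neg_shares = {v: share(c, "neg") for v, c in videos.items()}
print(neg_shares)                       # e.g. video_1 -> 0.25
print(pvariance(neg_shares.values()))   # dispersion of negative sentiment
```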
Increasing technology and product complexity is leading more and more companies to network with external organisations for their R&D. This gives rise to interorganisational R&D projects, which constitute temporary organisations. Research questions regarding these projects remain open, among others with respect to practices and rules of behaviour. Culture-aware project management can establish cooperation- and innovation-promoting practices and rules of behaviour, which are essential for these R&D projects. The research question of this contribution is therefore how project-culture-aware management of interorganisational R&D projects can be carried out. To this end, a project-culture-aware management model is developed on the basis of the theoretical foundations of R&D project management, of human action systems and levels of collaboration, and of culture and behaviour. The model comprises two parts. The first part identifies the area in which the project culture develops. The second part shows how the factors for likely cooperative and innovative behaviour within this area should be designed.
This paper takes a holistic view of an IP-traceability process in interorganizational R&D projects, as a particular open innovation mode. It aims to show different technologies that can be used in the front end and back end of a traceability process and to discuss these technologies in terms of their suitability for data from creativity processes in these projects. To achieve this goal, a two-stage literature review on different technologies in the context of traceability was conducted. Criteria were then derived from the characteristics of data from creativity processes and of interorganizational R&D projects, against which the resulting technologies were discussed. Finally, recommendations are given regarding suitable technologies for tracing individual creativity artifacts in interorganizational R&D projects.
Supply chains have become increasingly complex, making it difficult to ensure transparency throughout the whole supply chain. In this context, first approaches have emerged that adopt the immutable, decentralised, and secure characteristics of blockchain technology to increase the transparency, security, authenticity, and auditability of assets in supply chains. This paper investigates recent publications combining blockchain technology and supply chain management and classifies them with regard to the complexity to be mapped on the blockchain. As a result, the increase of supply chain transparency is identified as the main objective of recent blockchain projects in supply chain management. Most of the recent publications deal with simple supply chains and products, and the few approaches dealing with complex parts map only sub-areas of supply chains. Currently, no example exists that aims to increase the transparency of complex manufacturing supply chains and that enables the mapping of complex assembly processes, an efficient auditability of all assets, and the implementation of dynamic adjustments.
Classification model of supply chain events regarding their transferability to blockchain technology
(2021)
Blockchain technology represents a decentralized database that stores information securely in immutable data blocks. With regard to supply chain management, these characteristics offer potential for increasing supply chain transparency, visibility, automation, and efficiency. In this context, initial token-based mapping approaches exist to transfer certain supply chain events to the blockchain, such as the creation or assembly of parts as well as their transfer of ownership. However, the decentralized and immutable structure of blockchain technology also creates challenges. In particular, the limited scalability and storage capacity and the special requirements for storage formats currently make it impossible to map all supply chain events on the blockchain without restriction. As a first step, this paper identifies important supply chain events for different use cases combining blockchain technology and supply chain management. Secondly, the supply chain events are classified in terms of their expected technical properties and their relevance for the respective use case. Finally, the identified supply chain events are evaluated regarding their transferability to blockchain technology, and a classification model is introduced.
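A toy sketch of what such a classification of events by on-chain transferability might look like in code; the event properties, thresholds, and categories are invented for illustration and do not reproduce the paper's model:

```python
from dataclasses import dataclass

# Toy classification of supply chain events by on-chain transferability.
# Property names, thresholds, and categories are invented for illustration.

@dataclass
class SupplyChainEvent:
    name: str
    payload_bytes: int      # expected size of the event data
    frequency_per_day: int  # how often the event occurs
    audit_relevant: bool    # needed for later audits?

def classify(event: SupplyChainEvent) -> str:
    """Return a coarse recommendation: on-chain, hash-anchored, or off-chain."""
    if event.audit_relevant and event.payload_bytes <= 256:
        return "on-chain"        # small and audit-critical: store directly
    if event.audit_relevant:
        return "hash-anchored"   # large but audit-critical: store only a hash
    return "off-chain"           # neither small nor audit-critical

events = [
    SupplyChainEvent("transfer_of_ownership", 128, 50, True),
    SupplyChainEvent("assembly_of_parts", 4_096, 500, True),
    SupplyChainEvent("temperature_reading", 64, 86_400, False),
]
for e in events:
    print(e.name, "->", classify(e))
```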
Distributed ledger technologies such as blockchain offer an innovative way to increase visibility and security and thus reduce supply chain risks. This paper proposes a solution to increase the transparency and auditability of manufactured products in collaborative networks by adopting smart contract-based virtual identities. Compared with existing approaches, this extended smart contract-based solution offers manufacturing networks the possibility of adding privacy, content-updating, and portability mechanisms to smart contracts. As a result, the solution is suitable for the dynamic administration of complex supply chains.
Porous silica materials are often used for drug delivery. However, systems for simultaneous delivery of multiple drugs are scarce. Here we show that anisotropic and amphiphilic dumbbell core–shell silica microparticles with chemically selective environments can entrap and release two drugs simultaneously. The dumbbells consist of a large dense lobe and a smaller hollow hemisphere. Electron microscopy images show that the shells of both parts have mesoporous channels. In a simple etching process, the properly adjusted stirring speed and the application of ammonium fluoride as etching agent determine the shape and the surface anisotropy of the particles. The surface of the dense lobe and the small hemisphere differ in their zeta potentials consistent with differences in dye and drug entrapment. Confocal Raman microscopy and spectroscopy show that the two polyphenols curcumin (Cur) and quercetin (QT) accumulate in different compartments of the particles. The overall drug entrapment efficiency of Cur plus QT is high for the amphiphilic particles but differs widely between Cur and QT compared to controls of core–shell silica microspheres and uniformly charged dumbbell microparticles. Furthermore, Cur and QT loaded microparticles show different cancer cell inhibitory activities. The highest activity is detected for the dual drug loaded amphiphilic microparticles in comparison to the controls. In the long term, amphiphilic particles may open up new strategies for drug delivery.
Energy efficiency optimization techniques for the steady-state operation of induction machines are state of the art, and many methods have already been developed. However, many real-world industrial and electric vehicle applications cannot be considered to be in steady-state operation. The focus of this contribution is on the efficiency optimization of induction machines in dynamic operation. Online dynamic optimization is challenging due to the computational complexity and the low sample times required in an inverter. An offline optimization is therefore conducted to gain knowledge. Based on this offline optimal solution, a simple and easy-to-implement template-based solution is developed. This approach aims at replicating the solution found by the offline optimization by resembling the shape and anticipative characteristics of the optimal flux trajectory. The energy efficiency improvement of the template-based solution is verified by simulations and by measurements on a test bench using a real-world drive cycle scenario. For comparison, a model predictive numerical online optimization is investigated as well.
This paper studies the power of online search intensity metrics, measured by Google, for examining and forecasting exchange rates. We use panel data consisting of quarterly time series from 2004 to 2018 for the ten countries with the highest currency trading volume. As a novel contribution, we include various Google search intensity metrics in our panel data. We find that online search improves the overall econometric models and fits. First, four out of ten search variables are robustly significant at the one-percent level and enhance the macroeconomic exchange rate models. Second, country regressions corroborate the panel results, yet the predictive power of search intensity with regard to exchange rates varies by country. Third, we find higher prediction performance for our exchange rate models with search intensity, particularly with regard to the direction of the exchange rate. Overall, our approach reveals the added value of search intensity in exchange rate models.
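The general idea of comparing model fit with and without a search-intensity regressor can be sketched on synthetic data; the variables, coefficients, and sample below are invented and do not correspond to the study's quarterly panel:

```python
import numpy as np

# Illustrative OLS comparison on synthetic data: a stand-in for augmenting a
# macroeconomic exchange-rate model with a Google-Trends-style regressor.
rng = np.random.default_rng(42)
n = 200
interest_diff = rng.standard_normal(n)      # classic macro regressor (invented)
search_intensity = rng.standard_normal(n)   # added online-search regressor (invented)
fx_return = 0.5 * interest_diff + 0.3 * search_intensity + 0.1 * rng.standard_normal(n)

def r_squared(X, y):
    """Fit OLS with an intercept and return the coefficient of determination."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

r2_macro = r_squared(interest_diff[:, None], fx_return)
r2_with_search = r_squared(np.column_stack([interest_diff, search_intensity]), fx_return)

# Adding the search regressor improves the fit on this synthetic sample.
print(round(r2_macro, 3), round(r2_with_search, 3))
```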
With this strategy paper, the university, state, and university-of-applied-sciences libraries of the state of Baden-Württemberg formulate what they see as the central fields of development and challenges of the coming years. As academic and cultural institutions, the libraries and the Bibliotheksservice-Zentrum Baden-Württemberg (BSZ) jointly provide the academic information infrastructure. They embrace the challenges of digitisation and actively shape this transformation in dialogue with researchers, teachers, and students.
In this paper we describe an interactive web-based tool for the visual analysis of Formula 1 data. A calendar-like representation provides an overview of all races on a yearly basis, in either absolute or normalized time. After selecting a particular race, more details about it can be explored, and it is also possible to compare up to three different races. Besides visualizing details of individual races, driver and team performance over time can be analysed as well. A user study was conducted to obtain feedback on the usage of the application and to decide between different visualization options.