Due to its wide-ranging endocrine functions, adipose tissue influences the whole body’s metabolism. Engineering long-term stable and functional human adipose tissue is still challenging due to the limited availability of suitable biomaterials and adequate cell maturation. We used gellan gum (GG) to create manual and bioprinted adipose tissue models because of its similarities to the native extracellular matrix and its easily tunable properties. Gellan gum itself was neither toxic nor monocyte activating. The resulting hydrogels exhibited suitable viscoelastic properties for soft tissues and were stable for 98 days in vitro. Encapsulated human primary adipose-derived stem cells (ASCs) were adipogenically differentiated for 14 days and matured for an additional 84 days. Live-dead staining showed that encapsulated cells stayed viable until day 98, while intracellular lipid staining showed an increase over time and a differentiation rate of 76% between days 28 and 56. After 4 weeks of culture, adipocytes had a univacuolar morphology, expressed perilipin A, and secreted up to 73% more leptin. After establishing the bioprinting process, we demonstrated that the cells in printed hydrogels had high viability and exhibited an adipogenic phenotype and function. In summary, GG-based adipose tissue models show long-term stability and allow the maturation of ASCs into functional, univacuolar adipocytes.
Adipose tissue is related to the development and manifestation of multiple diseases, demonstrating the importance of suitable in vitro models for research purposes. In this study, adipose tissue lobuli were explanted, cultured, and used as an adipose tissue control to evaluate in vitro generated adipose tissue models. During culture, the lobules exhibited stable weight, lactate dehydrogenase release, and glycerol release over 15 days. For building up in vitro adipose tissue models, we adapted the composition and handling of the biomaterial gelatin methacryloyl (GelMA) to homogeneously mix and bioprint human primary mature adipocytes (MA) and adipose-derived stem cells (ASCs), respectively. Accelerated cooling of the bioink turned out to be essential for the homogeneous distribution of lipid-filled MAs in the hydrogel. Finally, we compared manual and bioprinted GelMA hydrogels with MA or ASCs and the explanted lobules to evaluate the impact of the printing process and to rate the models against the physiological reference. The viability analyses demonstrated no significant difference between the groups due to additive manufacturing. The staining of intracellular lipids and perilipin A suggests that GelMA is well suited for ASCs and MA. Therefore, we successfully constructed physiological in vitro models by bioprinting MA-containing GelMA bioinks.
Characterization of low density polyethylene greenhouse films during the composting of rose residues
(2022)
This study presents an evaluation of a potential alternative to plastic degradation in the form of organic composting. It stems from the urgent need to find solutions to plastic residues and focuses on the compost-based degradation of greenhouse film covers at a major rose exporter in Ecuador. Thus, this study analyzes the physical, chemical, and biological changes during rose waste composting and also evaluates the stability of new and aged agricultural plastic under these conditions. Interestingly, the compost characterization shows a slow degradation rate of organic matter and total organic carbon, along with a significant increase in pH and a rise in bacterial populations. However, the results demonstrate that despite these findings, composting conditions had no significant influence on plastic degradation, and while deterioration of aged plastic samples was reported in some tests, it may be the result of environmental conditions and prolonged exposure to solar radiation. Importantly, these factors could facilitate the adhesion of microorganisms and promote plastic biodegradation. Hence, future studies are encouraged to analyze the ecotoxicity of plastics in the compost, as well as to isolate, identify, and evaluate the possible biodegradative potential of these microorganisms as an alternative for plastic waste management.
In times of climate change and growing urbanization, the way food is produced and consumed also changes. Meanwhile, digitization is transforming farming practices, which also applies to the domestic growing of crops. More and more so-called smart home farms (SHF) are finding their way into private households. This paper conceptualizes the unique nature of enabled smart services and their underlying technology. Following an inductive interpretive approach, this study explores the antecedents of smart home farming practices. Our sample consists of eleven actual smart home farmers. We found six constructs to be of salient importance: expected outcomes related to harvesting, positive feelings, and sustainability; a combination of one's affinity for green and novel technologies; and the smartness and visibility of the enabled services. In the outlook, we present some preliminary thoughts for testing our qualitative findings.
The proper selection of a demand forecasting method is directly linked to the success of supply chain management (SCM). However, today’s manufacturing companies are confronted with uncertain and dynamic markets. Consequently, classical statistical methods are not always appropriate for accurate and reliable forecasting. Artificial intelligence (AI) algorithms are currently used to improve statistical methods. Existing literature only gives a very general overview of the AI methods used in combination with demand forecasting. This paper provides an analysis of the AI methods published in the last five years (2017-2021). Furthermore, a classification is presented by clustering the AI methods in order to identify the trend among the methods applied. Finally, a classification of the different AI methods according to the dimensionality of data, volume of data, and time horizon of the forecast is presented. The goal is to support the selection of the appropriate AI method to optimize demand forecasting.
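The core point of the abstract above — that the right forecasting method depends on data characteristics such as seasonality — can be illustrated with a minimal sketch. This is not from the paper; the series and both baselines (naive and seasonal-naive) are invented for illustration only.

```python
import math

def synthetic_demand(n=120, season=12):
    # Illustrative demand series: linear trend plus a seasonal pattern
    # (deterministic so the comparison is reproducible).
    return [100 + 0.5 * t + 20 * math.sin(2 * math.pi * t / season) for t in range(n)]

def mae(actual, forecast):
    # mean absolute error between two equally long series
    return sum(abs(a - f) for a, f in zip(actual, forecast)) / len(actual)

def naive_forecast(y):
    # forecast each point with the previous observation
    return y[:-1]

def seasonal_naive_forecast(y, season=12):
    # forecast each point with the value one season earlier
    return y[:-season]

y = synthetic_demand()
err_naive = mae(y[1:], naive_forecast(y))
err_seasonal = mae(y[12:], seasonal_naive_forecast(y))
print(f"naive MAE={err_naive:.2f}, seasonal-naive MAE={err_seasonal:.2f}")
```

On this seasonal series the seasonal-naive baseline clearly wins; on a non-seasonal series the ranking would flip, which is exactly why method selection needs the data characteristics the paper classifies by.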
The early involvement of insights gained through intelligence and data analysis is becoming increasingly important for developing new products, leading to a fundamentally different conception of product creation, development, and engineering processes that exploits the advantages a digital twin offers. We introduce a novel stage-gate process, holistically anchored in learning factories, that covers idea generation and idea screening at an early stage, beta testing of first prototypes, technical implementation in real production scenarios, business analysis, market evaluation, pricing, service models, and innovative social media portals. Corresponding product modelling in the sense of sustainability, the circular economy, and data analytics forecasts the product's market performance both before and after launch, with data interpretation interlinked in near real-time. The digital twin represents the link between the digital model and the digital shadow. Additionally, connecting the digital twin with the product provides constantly updated operating-status and process data as well as a mapping of technical properties and real-world behaviour. A future networked product, equipped through embedded information technology with the ability to initiate and carry out its own further development, is able to interact with people and environments and is thus relevant to the way of life of future generations. In today's development work on this new product-creation approach, "Werk150" is on the one hand the object of development itself and on the other hand the validation environment. In the next step, new learning modules and scenarios for master-level trainings will be derived from these findings.
We study whether compulsory religious education in schools affects students' religiosity as adults. We exploit the staggered termination of compulsory religious education across German states in models with state and cohort fixed effects. Using three different datasets, we find that abolishing compulsory religious education significantly reduced religiosity of affected students in adulthood. It also reduced the religious actions of personal prayer, church-going, and church membership. Beyond religious attitudes, the reform led to more equalized gender roles, fewer marriages and children, and higher labor-market participation and earnings. The reform did not affect ethical and political values or non-religious school outcomes.
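The identification strategy described above (staggered reform timing with state and cohort fixed effects) reduces, in the simplest two-group/two-period case, to a difference-in-differences comparison. The sketch below illustrates only that logic; the numbers are invented and are not the paper's data or estimates.

```python
from statistics import mean

# Hypothetical religiosity scores by group: states that terminated compulsory
# religious education ("reform") vs. states that did not ("control"),
# for cohorts schooled before vs. after the termination.
reform_pre = [0.62, 0.58, 0.60]
reform_post = [0.48, 0.45, 0.51]
control_pre = [0.61, 0.59, 0.63]
control_post = [0.60, 0.57, 0.62]

# Difference-in-differences: the change in reform states minus the
# contemporaneous change in control states nets out common cohort trends.
did = (mean(reform_post) - mean(reform_pre)) - (mean(control_post) - mean(control_pre))
print(f"DiD estimate of the reform effect on religiosity: {did:.3f}")
```

A negative estimate here corresponds to the paper's finding that abolishing compulsory religious education reduced later-life religiosity; the actual study additionally absorbs state and cohort heterogeneity through fixed effects across many states and cohorts.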
Being exposed to compulsory religious education in school can have long-run consequences for students’ lives. At different points in time since the 1970s, German states terminated compulsory religious education in public schools and replaced it by a choice between ethics classes and religious education. This article shows that the reform not only led to reduced religiosity in students’ later life, but also eroded traditional attitudes towards gender roles and increased labor-market participation and earnings.
Nowadays, the importance of early active patient mobilization in the recovery and rehabilitation phase has increased significantly. One way to involve patients in the treatment is a gamification-like approach, which is a method of motivation used in various life processes. This article presents a system prototype for patients who require physical activity because of active early mobilization after medical interventions or during illness. Bedridden patients and people with a sedentary lifestyle (predominantly lying in bed) are also potential users. The main idea of the concept was a contactless implementation so that patients can use the system effortlessly. The system consists of three related parts: hardware, software, and a game application. To test the relevance and coherence of the system, it was used by 35 people. The participants were asked to play a video game requiring them to make body movements while lying down. They were then asked to take part in a small survey to evaluate the system's usability. As a result, we offer a prototype consisting of hardware and software parts that can increase and diversify physical activity during active early mobilization of patients and prevent possible health problems caused by predominantly low activity. The proposed design could be implemented in hospitals, rehabilitation centers, and even at home.
Sleep analysis using a polysomnography system is difficult and expensive. That is why we suggest a non-invasive and unobtrusive measurement. Very few people want cables or devices attached to their bodies during sleep. The proposed approach is to implement a monitoring system so that the subject is not bothered. As a result, the idea is a non-invasive monitoring system based on detecting pressure distribution. This system should be able to measure, through the mattress, the pressure differences that occur during a single heartbeat and during breathing. The system consists of two blocks: signal acquisition and signal processing. The whole technology should be economical enough to be affordable for every user. As a result, preprocessed data is obtained for further detailed analysis, using different filters for heartbeat and respiration detection. In the initial stage of filtration, Butterworth filters are used.
With the progress of technology in modern hospitals, intelligent perioperative situation recognition will gain more relevance due to its potential to substantially improve surgical workflows by providing situation knowledge in real-time. Such knowledge can be extracted from image data by machine learning techniques but poses a privacy threat to the staff’s and patients’ personal data. De-identification is a possible solution for removing visual sensitive information. In this work, we developed a YOLO v3 based prototype to detect sensitive areas in the image in real-time. These are then de-identified using common image obfuscation techniques. Our approach shows that it is in principle suitable for de-identifying sensitive data in OR images and contributes to a privacy-respectful way of processing in the context of situation recognition in the OR.
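The obfuscation step of such a pipeline can be sketched as below. This is an assumption-laden illustration, not the paper's implementation: the bounding box would come from the YOLO v3 detector, but here it is hard-coded, and pixelation stands in for whatever "common image obfuscation techniques" the prototype actually uses.

```python
import numpy as np

def pixelate(image, box, block=8):
    # Replace each block x block tile inside the detected box with its mean
    # color, so faces or documents in that region are no longer recognizable.
    # Expects an H x W x C image (e.g. uint8 RGB); modifies it in place.
    x0, y0, x1, y1 = box
    roi = image[y0:y1, x0:x1]
    h, w = roi.shape[:2]
    h2, w2 = h - h % block, w - w % block  # crop to whole tiles
    tiles = roi[:h2, :w2].reshape(h2 // block, block, w2 // block, block, -1)
    means = tiles.mean(axis=(1, 3), keepdims=True)
    roi[:h2, :w2] = np.broadcast_to(means, tiles.shape).reshape(h2, w2, -1)
    return image

# Demo on a random stand-in frame; the box would be a detector output.
rng = np.random.default_rng(0)
frame = rng.integers(0, 256, (64, 64, 3)).astype(np.uint8)
detected_box = (16, 16, 48, 48)  # hypothetical sensitive region (x0, y0, x1, y1)
pixelate(frame, detected_box, block=8)
```

Because pixelation is a pure array operation per frame, it fits the real-time constraint the abstract emphasizes: only the detected regions are touched, and the rest of the image remains available for situation recognition.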
In our initial DaMoN paper, we set out the goal to revisit the results of “Staring into the Abyss [...] of Concurrency Control with [1000] Cores” (Yu in Proc. VLDB Endow 8: 209-220, 2014). Contrary to their assumption, today we do not see single-socket CPUs with 1000 cores. Instead, multi-socket hardware is prevalent today and in fact offers over 1000 cores. Hence, we evaluated concurrency control (CC) schemes on a real (Intel-based) multi-socket platform. To our surprise, we made interesting findings opposing the results of the original analysis, which we discussed in our initial DaMoN paper. In this paper, we further broaden our analysis, detailing the effect of hardware and workload characteristics via additional real hardware platforms (IBM Power8 and 9) and the full TPC-C transaction mix. Among others, we identified clear connections between the performance of the CC schemes and hardware characteristics, especially concerning NUMA and CPU cache. Overall, we conclude that no CC scheme can efficiently make use of large multi-socket hardware in a robust manner and suggest several directions on how CC schemes and overall OLTP DBMS should evolve in future.
Glioblastoma WHO IV belongs to a group of brain tumors that are still incurable. A promising treatment approach applies photodynamic therapy (PDT) with hypericin as a photosensitizer. To generate a comprehensive understanding of the photosensitizer-tumor interactions, the first part of our study is focused on investigating the distribution and penetration behavior of hypericin in glioma cell spheroids by fluorescence microscopy. In the second part, fluorescence lifetime imaging microscopy (FLIM) was used to correlate fluorescence lifetime (FLT) changes of hypericin to environmental effects inside the spheroids. In this context, 3D tumor spheroids are an excellent model system since they feature 3D cell–cell interactions and an extracellular matrix similar to tumors in vivo. Our analytical approach considers hypericin as a probe molecule for FLIM and as a photosensitizer for PDT at the same time, making it possible to directly draw conclusions about the state and location of the drug in a biological system. The knowledge of both state and location of hypericin makes a fundamental understanding of the impact of hypericin PDT in brain tumors possible. Following different incubation conditions, the hypericin distribution in peripheral and central cryosections of the spheroids was analyzed. Both fluorescence microscopy and FLIM revealed a hypericin gradient towards the spheroid core for short incubation periods or small concentrations. On the other hand, a homogeneous hypericin distribution was observed for long incubation times and high concentrations. In particular, the observed FLT change is crucial for the PDT efficiency, since the triplet yield, and hence the O2 activation, is directly proportional to the FLT. Based on the FLT increase inside spheroids, an incubation time of 30 min is required to achieve the most suitable conditions for an effective PDT.
Even though near-data processing (NDP) can provably reduce data transfers and increase performance, current NDP is utilized solely in read-only settings. Synchronization and invalidation mechanisms between host and smart storage that are slow or tedious to implement make NDP support for data-intensive update operations difficult. In this paper, we introduce a low-latency cache-coherent shared lock table for update NDP settings in disaggregated memory environments. It utilizes the novel CCIX interconnect technology and is integrated in neoDBMS, a near-data processing DBMS for smart storage. Our evaluation indicates end-to-end lock latencies of ∼80-100 ns and robust performance under contention.
Multi-versioning and MVCC are the foundations of many modern DBMSs. Under mixed workloads and large datasets, the creation of the transactional snapshot can become very expensive, as long-running analytical transactions may request old versions, residing on cold storage, for reasons of transactional consistency. Furthermore, analytical queries operate on cold data, stored on slow persistent storage. Due to the poor data locality, snapshot creation may cause massive data transfers and thus lower performance. Given the current trend towards computational storage and near-data processing, it has become viable to perform such operations in-storage to reduce data transfers and improve scalability. neoDBMS is a DBMS designed for near-data processing and computational storage. In this paper, we demonstrate how neoDBMS performs snapshot computation in-situ. We showcase different interactive scenarios, where neoDBMS outperforms PostgreSQL 12 by up to 5×.
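The snapshot logic that the two abstracts above build on can be made concrete with a minimal sketch of MVCC version visibility. This is generic textbook MVCC, not neoDBMS internals: the data structures and the visibility rule below are simplified assumptions for illustration.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Version:
    value: str
    created_by: int                       # txn id that wrote this version
    superseded_by: Optional[int] = None   # txn id that replaced it, if any

def visible(version, snapshot_txns):
    # snapshot_txns: set of txn ids that had committed when the snapshot was
    # taken. A version is visible if its writer is in the snapshot and no
    # transaction in the snapshot has superseded it.
    if version.created_by not in snapshot_txns:
        return False
    return version.superseded_by is None or version.superseded_by not in snapshot_txns

def read(chain, snapshot_txns):
    # chain: version chain ordered newest to oldest; return the first
    # visible version's value, i.e. the transactionally consistent one.
    for v in chain:
        if visible(v, snapshot_txns):
            return v.value
    return None

chain = [
    Version("v3", created_by=30),
    Version("v2", created_by=20, superseded_by=30),
    Version("v1", created_by=10, superseded_by=20),
]
print(read(chain, snapshot_txns={10, 20}))  # an older snapshot reads "v2"
```

The chain walk illustrates why long-running analytical snapshots get expensive: an old snapshot must traverse past newer versions to reach the one it may see, and when those old versions reside on cold storage this traversal causes exactly the data transfers that in-storage snapshot computation avoids.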
The functionality of existing cyber-physical production systems generally focuses on mapping technological specifications derived from production requirements. Consequently, such systems base their conception on a structurally mechanistic paradigm. Insofar as these approaches have considered humans, their conception is likewise based on the structurally identical paradigm. Due to the fundamental reorientation towards explicitly human-centered approaches, it becomes increasingly apparent that essential aspects of the dimension "human" remain unconsidered by the previous paradigm. To overcome such limitations, mapping the "social" dimension requires a structurally different approach. In this paper, an anthropocentric approach is developed based on possible conceptions of the human being, enabling a structural integration of the human being in an extended dimension. Through the model, extending concepts for better integration of the human being in the sense of human-centered approaches, as envisioned in the Industrie 5.0 conception, becomes possible.
Artificial intelligence is a field of research that is seen as a means of realizing digitalization and Industry 4.0. It is considered the critical technology needed to drive the future evolution of manufacturing systems. At the same time, autonomous guided vehicles (AGV) have developed into an essential component of manufacturing systems due to the flexibility they contribute to the whole manufacturing process. However, there are still open challenges in the intelligent control of these vehicles on the factory floor, especially in dynamic environments, where resources should be controlled in such a way that they can be adjusted to turbulences efficiently. Therefore, this paper develops a conceptual framework addressing a catalog of criteria that considers several machine learning algorithms in order to find the optimal algorithm for the intelligent control of AGVs. By applying the developed framework, the algorithm most suitable for the current operation of the AGV is automatically selected, enabling efficient control within the factory environment. In future work, this decision-making framework can be transferred to further scenarios with multiple AGV systems, including internal communication along AGV fleets. With this study, the automatic selection of the optimal machine learning algorithm for the AGV improves performance in such a way that computational power is distributed efficiently within a hybrid system linking the AGV and cloud storage.
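The selection idea behind such a framework can be sketched as a weighted scoring over a catalog of criteria. Everything concrete below is hypothetical: the criteria names, their weights, the candidate algorithms, and the scores are invented for illustration and are not the paper's actual catalog.

```python
# Hypothetical criteria weights (how much each criterion matters for the
# current AGV operation); in the paper these come from the catalog of criteria.
CRITERIA_WEIGHTS = {"dynamic_env": 0.4, "training_cost": 0.2, "inference_speed": 0.4}

# Hypothetical 0..1 scores: how well each candidate algorithm satisfies
# each criterion. Real values would be derived from benchmarks or expert input.
ALGORITHM_SCORES = {
    "Q-learning": {"dynamic_env": 0.9, "training_cost": 0.5, "inference_speed": 0.8},
    "A*-replanning": {"dynamic_env": 0.5, "training_cost": 1.0, "inference_speed": 0.6},
    "DQN": {"dynamic_env": 0.95, "training_cost": 0.2, "inference_speed": 0.7},
}

def select_algorithm(weights, scores):
    # Weighted sum over criteria; the candidate with the highest total wins.
    def total(name):
        return sum(weights[c] * scores[name][c] for c in weights)
    return max(scores, key=total)

print(select_algorithm(CRITERIA_WEIGHTS, ALGORITHM_SCORES))
```

Because the weights can be updated as the operating situation changes (e.g. more turbulence raises the weight of the dynamic-environment criterion), re-running the selection yields the automatic, situation-dependent algorithm choice the abstract describes.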
Modern production systems are characterized by the increasing use of CPS and IoT networks. However, processing the available information for adaptation and reconfiguration often occurs in relatively large time cycles. It thus does not take advantage of the optimization potential available in the short term. In this paper, a concept is presented that, considering the process information of the individual heterogeneous system elements, detects optimization potentials and performs or proposes adaptation or reconfiguration. The concept is evaluated by means of a case study in a learning factory. The resulting system thus enables better exploitation of the potentials of the CPPS.